Nov 24 12:23:03 localhost kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 24 12:23:03 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 24 12:23:03 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 12:23:03 localhost kernel: BIOS-provided physical RAM map:
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 24 12:23:03 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 24 12:23:03 localhost kernel: NX (Execute Disable) protection: active
Nov 24 12:23:03 localhost kernel: APIC: Static calls initialized
Nov 24 12:23:03 localhost kernel: SMBIOS 2.8 present.
Nov 24 12:23:03 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 24 12:23:03 localhost kernel: Hypervisor detected: KVM
Nov 24 12:23:03 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 24 12:23:03 localhost kernel: kvm-clock: using sched offset of 4908646812 cycles
Nov 24 12:23:03 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 24 12:23:03 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 24 12:23:03 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 24 12:23:03 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 24 12:23:03 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 24 12:23:03 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 24 12:23:03 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 24 12:23:03 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 24 12:23:03 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 24 12:23:03 localhost kernel: Using GB pages for direct mapping
Nov 24 12:23:03 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 24 12:23:03 localhost kernel: ACPI: Early table checksum verification disabled
Nov 24 12:23:03 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 24 12:23:03 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 12:23:03 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 12:23:03 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 12:23:03 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 24 12:23:03 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 12:23:03 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 12:23:03 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 24 12:23:03 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 24 12:23:03 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 24 12:23:03 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 24 12:23:03 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 24 12:23:03 localhost kernel: No NUMA configuration found
Nov 24 12:23:03 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 24 12:23:03 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 24 12:23:03 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 24 12:23:03 localhost kernel: Zone ranges:
Nov 24 12:23:03 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 24 12:23:03 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 24 12:23:03 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 24 12:23:03 localhost kernel:   Device   empty
Nov 24 12:23:03 localhost kernel: Movable zone start for each node
Nov 24 12:23:03 localhost kernel: Early memory node ranges
Nov 24 12:23:03 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 24 12:23:03 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 24 12:23:03 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 24 12:23:03 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 24 12:23:03 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 24 12:23:03 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 24 12:23:03 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 24 12:23:03 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 24 12:23:03 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 24 12:23:03 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 24 12:23:03 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 24 12:23:03 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 24 12:23:03 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 24 12:23:03 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 24 12:23:03 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 24 12:23:03 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 24 12:23:03 localhost kernel: TSC deadline timer available
Nov 24 12:23:03 localhost kernel: CPU topo: Max. logical packages:   8
Nov 24 12:23:03 localhost kernel: CPU topo: Max. logical dies:       8
Nov 24 12:23:03 localhost kernel: CPU topo: Max. dies per package:   1
Nov 24 12:23:03 localhost kernel: CPU topo: Max. threads per core:   1
Nov 24 12:23:03 localhost kernel: CPU topo: Num. cores per package:     1
Nov 24 12:23:03 localhost kernel: CPU topo: Num. threads per package:   1
Nov 24 12:23:03 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 24 12:23:03 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 24 12:23:03 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 24 12:23:03 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 24 12:23:03 localhost kernel: Booting paravirtualized kernel on KVM
Nov 24 12:23:03 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 24 12:23:03 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 24 12:23:03 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 24 12:23:03 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 24 12:23:03 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 24 12:23:03 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 24 12:23:03 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 12:23:03 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 24 12:23:03 localhost kernel: random: crng init done
Nov 24 12:23:03 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 24 12:23:03 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 24 12:23:03 localhost kernel: Fallback order for Node 0: 0 
Nov 24 12:23:03 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 24 12:23:03 localhost kernel: Policy zone: Normal
Nov 24 12:23:03 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 24 12:23:03 localhost kernel: software IO TLB: area num 8.
Nov 24 12:23:03 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 24 12:23:03 localhost kernel: ftrace: allocating 49298 entries in 193 pages
Nov 24 12:23:03 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 24 12:23:03 localhost kernel: Dynamic Preempt: voluntary
Nov 24 12:23:03 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 24 12:23:03 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 24 12:23:03 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 24 12:23:03 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 24 12:23:03 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 24 12:23:03 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 24 12:23:03 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 24 12:23:03 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 24 12:23:03 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 12:23:03 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 12:23:03 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 12:23:03 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 24 12:23:03 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 24 12:23:03 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 24 12:23:03 localhost kernel: Console: colour VGA+ 80x25
Nov 24 12:23:03 localhost kernel: printk: console [ttyS0] enabled
Nov 24 12:23:03 localhost kernel: ACPI: Core revision 20230331
Nov 24 12:23:03 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 24 12:23:03 localhost kernel: x2apic enabled
Nov 24 12:23:03 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 24 12:23:03 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 24 12:23:03 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 24 12:23:03 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 24 12:23:03 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 24 12:23:03 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 24 12:23:03 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 24 12:23:03 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 24 12:23:03 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 24 12:23:03 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 24 12:23:03 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 24 12:23:03 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 24 12:23:03 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 24 12:23:03 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 24 12:23:03 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 24 12:23:03 localhost kernel: x86/bugs: return thunk changed
Nov 24 12:23:03 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 24 12:23:03 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 24 12:23:03 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 24 12:23:03 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 24 12:23:03 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 24 12:23:03 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 24 12:23:03 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 24 12:23:03 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 24 12:23:03 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 24 12:23:03 localhost kernel: landlock: Up and running.
Nov 24 12:23:03 localhost kernel: Yama: becoming mindful.
Nov 24 12:23:03 localhost kernel: SELinux:  Initializing.
Nov 24 12:23:03 localhost kernel: LSM support for eBPF active
Nov 24 12:23:03 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 24 12:23:03 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 24 12:23:03 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 24 12:23:03 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 24 12:23:03 localhost kernel: ... version:                0
Nov 24 12:23:03 localhost kernel: ... bit width:              48
Nov 24 12:23:03 localhost kernel: ... generic registers:      6
Nov 24 12:23:03 localhost kernel: ... value mask:             0000ffffffffffff
Nov 24 12:23:03 localhost kernel: ... max period:             00007fffffffffff
Nov 24 12:23:03 localhost kernel: ... fixed-purpose events:   0
Nov 24 12:23:03 localhost kernel: ... event mask:             000000000000003f
Nov 24 12:23:03 localhost kernel: signal: max sigframe size: 1776
Nov 24 12:23:03 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 24 12:23:03 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 24 12:23:03 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 24 12:23:03 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 24 12:23:03 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 24 12:23:03 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 24 12:23:03 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 24 12:23:03 localhost kernel: node 0 deferred pages initialised in 10ms
Nov 24 12:23:03 localhost kernel: Memory: 7765676K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616272K reserved, 0K cma-reserved)
Nov 24 12:23:03 localhost kernel: devtmpfs: initialized
Nov 24 12:23:03 localhost kernel: x86/mm: Memory block size: 128MB
Nov 24 12:23:03 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 24 12:23:03 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 24 12:23:03 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 24 12:23:03 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 24 12:23:03 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 24 12:23:03 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 24 12:23:03 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 24 12:23:03 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 24 12:23:03 localhost kernel: audit: type=2000 audit(1763986981.576:1): state=initialized audit_enabled=0 res=1
Nov 24 12:23:03 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 24 12:23:03 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 24 12:23:03 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 24 12:23:03 localhost kernel: cpuidle: using governor menu
Nov 24 12:23:03 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 24 12:23:03 localhost kernel: PCI: Using configuration type 1 for base access
Nov 24 12:23:03 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 24 12:23:03 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 24 12:23:03 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 24 12:23:03 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 24 12:23:03 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 24 12:23:03 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 24 12:23:03 localhost kernel: Demotion targets for Node 0: null
Nov 24 12:23:03 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 24 12:23:03 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 24 12:23:03 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 24 12:23:03 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 24 12:23:03 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 24 12:23:03 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 24 12:23:03 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 24 12:23:03 localhost kernel: ACPI: Interpreter enabled
Nov 24 12:23:03 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 24 12:23:03 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 24 12:23:03 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 24 12:23:03 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 24 12:23:03 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 24 12:23:03 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 24 12:23:03 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [3] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [4] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [5] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [6] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [7] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [8] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [9] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [10] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [11] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [12] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [13] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [14] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [15] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [16] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [17] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [18] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [19] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [20] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [21] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [22] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [23] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [24] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [25] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [26] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [27] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [28] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [29] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [30] registered
Nov 24 12:23:03 localhost kernel: acpiphp: Slot [31] registered
Nov 24 12:23:03 localhost kernel: PCI host bridge to bus 0000:00
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 24 12:23:03 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 24 12:23:03 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 24 12:23:03 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 24 12:23:03 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 24 12:23:03 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 24 12:23:03 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 24 12:23:03 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 24 12:23:03 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 24 12:23:03 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 24 12:23:03 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 24 12:23:03 localhost kernel: iommu: Default domain type: Translated
Nov 24 12:23:03 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 24 12:23:03 localhost kernel: SCSI subsystem initialized
Nov 24 12:23:03 localhost kernel: ACPI: bus type USB registered
Nov 24 12:23:03 localhost kernel: usbcore: registered new interface driver usbfs
Nov 24 12:23:03 localhost kernel: usbcore: registered new interface driver hub
Nov 24 12:23:03 localhost kernel: usbcore: registered new device driver usb
Nov 24 12:23:03 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 24 12:23:03 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 24 12:23:03 localhost kernel: PTP clock support registered
Nov 24 12:23:03 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 24 12:23:03 localhost kernel: NetLabel: Initializing
Nov 24 12:23:03 localhost kernel: NetLabel:  domain hash size = 128
Nov 24 12:23:03 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 24 12:23:03 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 24 12:23:03 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 24 12:23:03 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 24 12:23:03 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 24 12:23:03 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 24 12:23:03 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 24 12:23:03 localhost kernel: vgaarb: loaded
Nov 24 12:23:03 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 24 12:23:03 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 24 12:23:03 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 24 12:23:03 localhost kernel: pnp: PnP ACPI init
Nov 24 12:23:03 localhost kernel: pnp 00:03: [dma 2]
Nov 24 12:23:03 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 24 12:23:03 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 24 12:23:03 localhost kernel: NET: Registered PF_INET protocol family
Nov 24 12:23:03 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 24 12:23:03 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 24 12:23:03 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 24 12:23:03 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 24 12:23:03 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 24 12:23:03 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 24 12:23:03 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 24 12:23:03 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 24 12:23:03 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 24 12:23:03 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 24 12:23:03 localhost kernel: NET: Registered PF_XDP protocol family
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 24 12:23:03 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 24 12:23:03 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 24 12:23:03 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 24 12:23:03 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 81347 usecs
Nov 24 12:23:03 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 24 12:23:03 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 24 12:23:03 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 24 12:23:03 localhost kernel: ACPI: bus type thunderbolt registered
Nov 24 12:23:03 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 24 12:23:03 localhost kernel: Initialise system trusted keyrings
Nov 24 12:23:03 localhost kernel: Key type blacklist registered
Nov 24 12:23:03 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 24 12:23:03 localhost kernel: zbud: loaded
Nov 24 12:23:03 localhost kernel: integrity: Platform Keyring initialized
Nov 24 12:23:03 localhost kernel: integrity: Machine keyring initialized
Nov 24 12:23:03 localhost kernel: Freeing initrd memory: 85868K
Nov 24 12:23:03 localhost kernel: NET: Registered PF_ALG protocol family
Nov 24 12:23:03 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 24 12:23:03 localhost kernel: Key type asymmetric registered
Nov 24 12:23:03 localhost kernel: Asymmetric key parser 'x509' registered
Nov 24 12:23:03 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 24 12:23:03 localhost kernel: io scheduler mq-deadline registered
Nov 24 12:23:03 localhost kernel: io scheduler kyber registered
Nov 24 12:23:03 localhost kernel: io scheduler bfq registered
Nov 24 12:23:03 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 24 12:23:03 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 24 12:23:03 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 24 12:23:03 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 24 12:23:03 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 24 12:23:03 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 24 12:23:03 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 24 12:23:03 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 24 12:23:03 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 24 12:23:03 localhost kernel: Non-volatile memory driver v1.3
Nov 24 12:23:03 localhost kernel: rdac: device handler registered
Nov 24 12:23:03 localhost kernel: hp_sw: device handler registered
Nov 24 12:23:03 localhost kernel: emc: device handler registered
Nov 24 12:23:03 localhost kernel: alua: device handler registered
Nov 24 12:23:03 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 24 12:23:03 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 24 12:23:03 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 24 12:23:03 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 24 12:23:03 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 24 12:23:03 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 24 12:23:03 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 24 12:23:03 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 24 12:23:03 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 24 12:23:03 localhost kernel: hub 1-0:1.0: USB hub found
Nov 24 12:23:03 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 24 12:23:03 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 24 12:23:03 localhost kernel: usbserial: USB Serial support registered for generic
Nov 24 12:23:03 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 24 12:23:03 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 24 12:23:03 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 24 12:23:03 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 24 12:23:03 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 24 12:23:03 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 24 12:23:03 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 24 12:23:03 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 24 12:23:03 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-24T12:23:02 UTC (1763986982)
Nov 24 12:23:03 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 24 12:23:03 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 24 12:23:03 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 24 12:23:03 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 24 12:23:03 localhost kernel: usbcore: registered new interface driver usbhid
Nov 24 12:23:03 localhost kernel: usbhid: USB HID core driver
Nov 24 12:23:03 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 24 12:23:03 localhost kernel: Initializing XFRM netlink socket
Nov 24 12:23:03 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 24 12:23:03 localhost kernel: Segment Routing with IPv6
Nov 24 12:23:03 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 24 12:23:03 localhost kernel: mpls_gso: MPLS GSO support
Nov 24 12:23:03 localhost kernel: IPI shorthand broadcast: enabled
Nov 24 12:23:03 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 24 12:23:03 localhost kernel: AES CTR mode by8 optimization enabled
Nov 24 12:23:03 localhost kernel: sched_clock: Marking stable (1263007323, 146960202)->(1540113259, -130145734)
Nov 24 12:23:03 localhost kernel: registered taskstats version 1
Nov 24 12:23:03 localhost kernel: Loading compiled-in X.509 certificates
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 24 12:23:03 localhost kernel: Demotion targets for Node 0: null
Nov 24 12:23:03 localhost kernel: page_owner is disabled
Nov 24 12:23:03 localhost kernel: Key type .fscrypt registered
Nov 24 12:23:03 localhost kernel: Key type fscrypt-provisioning registered
Nov 24 12:23:03 localhost kernel: Key type big_key registered
Nov 24 12:23:03 localhost kernel: Key type encrypted registered
Nov 24 12:23:03 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 24 12:23:03 localhost kernel: Loading compiled-in module X.509 certificates
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 24 12:23:03 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 24 12:23:03 localhost kernel: ima: No architecture policies found
Nov 24 12:23:03 localhost kernel: evm: Initialising EVM extended attributes:
Nov 24 12:23:03 localhost kernel: evm: security.selinux
Nov 24 12:23:03 localhost kernel: evm: security.SMACK64 (disabled)
Nov 24 12:23:03 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 24 12:23:03 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 24 12:23:03 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 24 12:23:03 localhost kernel: evm: security.apparmor (disabled)
Nov 24 12:23:03 localhost kernel: evm: security.ima
Nov 24 12:23:03 localhost kernel: evm: security.capability
Nov 24 12:23:03 localhost kernel: evm: HMAC attrs: 0x1
Nov 24 12:23:03 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 24 12:23:03 localhost kernel: Running certificate verification RSA selftest
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 24 12:23:03 localhost kernel: Running certificate verification ECDSA selftest
Nov 24 12:23:03 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 24 12:23:03 localhost kernel: clk: Disabling unused clocks
Nov 24 12:23:03 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 24 12:23:03 localhost kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 24 12:23:03 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 24 12:23:03 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 24 12:23:03 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 24 12:23:03 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 24 12:23:03 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 24 12:23:03 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 24 12:23:03 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 24 12:23:03 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 24 12:23:03 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 24 12:23:03 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 24 12:23:03 localhost kernel: Run /init as init process
Nov 24 12:23:03 localhost kernel:   with arguments:
Nov 24 12:23:03 localhost kernel:     /init
Nov 24 12:23:03 localhost kernel:   with environment:
Nov 24 12:23:03 localhost kernel:     HOME=/
Nov 24 12:23:03 localhost kernel:     TERM=linux
Nov 24 12:23:03 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64
Nov 24 12:23:03 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 24 12:23:03 localhost systemd[1]: Detected virtualization kvm.
Nov 24 12:23:03 localhost systemd[1]: Detected architecture x86-64.
Nov 24 12:23:03 localhost systemd[1]: Running in initrd.
Nov 24 12:23:03 localhost systemd[1]: No hostname configured, using default hostname.
Nov 24 12:23:03 localhost systemd[1]: Hostname set to <localhost>.
Nov 24 12:23:03 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 24 12:23:03 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 24 12:23:03 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 24 12:23:03 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 24 12:23:03 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 24 12:23:03 localhost systemd[1]: Reached target Local File Systems.
Nov 24 12:23:03 localhost systemd[1]: Reached target Path Units.
Nov 24 12:23:03 localhost systemd[1]: Reached target Slice Units.
Nov 24 12:23:03 localhost systemd[1]: Reached target Swaps.
Nov 24 12:23:03 localhost systemd[1]: Reached target Timer Units.
Nov 24 12:23:03 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 24 12:23:03 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 24 12:23:03 localhost systemd[1]: Listening on Journal Socket.
Nov 24 12:23:03 localhost systemd[1]: Listening on udev Control Socket.
Nov 24 12:23:03 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 24 12:23:03 localhost systemd[1]: Reached target Socket Units.
Nov 24 12:23:03 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 24 12:23:03 localhost systemd[1]: Starting Journal Service...
Nov 24 12:23:03 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 24 12:23:03 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 24 12:23:03 localhost systemd[1]: Starting Create System Users...
Nov 24 12:23:03 localhost systemd[1]: Starting Setup Virtual Console...
Nov 24 12:23:03 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 24 12:23:03 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 24 12:23:03 localhost systemd[1]: Finished Create System Users.
Nov 24 12:23:03 localhost systemd-journald[306]: Journal started
Nov 24 12:23:03 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/44bc1d66b47f487f828ac6d054feec1c) is 8.0M, max 153.6M, 145.6M free.
Nov 24 12:23:03 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 24 12:23:03 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 24 12:23:03 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 24 12:23:03 localhost systemd[1]: Started Journal Service.
Nov 24 12:23:03 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 24 12:23:03 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 24 12:23:03 localhost systemd[1]: Finished Setup Virtual Console.
Nov 24 12:23:03 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 24 12:23:03 localhost systemd[1]: Starting dracut cmdline hook...
Nov 24 12:23:03 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 24 12:23:03 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Nov 24 12:23:03 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 12:23:03 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 24 12:23:03 localhost systemd[1]: Finished dracut cmdline hook.
Nov 24 12:23:03 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 24 12:23:03 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 24 12:23:03 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 24 12:23:03 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 24 12:23:03 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 24 12:23:03 localhost kernel: RPC: Registered udp transport module.
Nov 24 12:23:03 localhost kernel: RPC: Registered tcp transport module.
Nov 24 12:23:03 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 24 12:23:03 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 24 12:23:03 localhost rpc.statd[445]: Version 2.5.4 starting
Nov 24 12:23:03 localhost rpc.statd[445]: Initializing NSM state
Nov 24 12:23:03 localhost rpc.idmapd[450]: Setting log level to 0
Nov 24 12:23:03 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 24 12:23:03 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 24 12:23:03 localhost systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Nov 24 12:23:03 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 24 12:23:03 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 24 12:23:03 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 24 12:23:03 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 24 12:23:04 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 24 12:23:04 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 12:23:04 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 24 12:23:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 12:23:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 12:23:04 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 24 12:23:04 localhost systemd[1]: Reached target Network.
Nov 24 12:23:04 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 24 12:23:04 localhost systemd[1]: Starting dracut initqueue hook...
Nov 24 12:23:04 localhost kernel: libata version 3.00 loaded.
Nov 24 12:23:04 localhost systemd-udevd[490]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 12:23:04 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 24 12:23:04 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 24 12:23:04 localhost kernel: scsi host0: ata_piix
Nov 24 12:23:04 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 24 12:23:04 localhost kernel:  vda: vda1
Nov 24 12:23:04 localhost kernel: scsi host1: ata_piix
Nov 24 12:23:04 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 24 12:23:04 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 24 12:23:04 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 24 12:23:04 localhost systemd[1]: Reached target Initrd Root Device.
Nov 24 12:23:04 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 24 12:23:04 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 24 12:23:04 localhost systemd[1]: Reached target System Initialization.
Nov 24 12:23:04 localhost systemd[1]: Reached target Basic System.
Nov 24 12:23:04 localhost kernel: ata1: found unknown device (class 0)
Nov 24 12:23:04 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 24 12:23:04 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 24 12:23:04 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 24 12:23:04 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 24 12:23:04 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 24 12:23:04 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 24 12:23:04 localhost systemd[1]: Finished dracut initqueue hook.
Nov 24 12:23:04 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 24 12:23:04 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 24 12:23:04 localhost systemd[1]: Reached target Remote File Systems.
Nov 24 12:23:04 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 24 12:23:04 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 24 12:23:04 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 24 12:23:04 localhost systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Nov 24 12:23:04 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 24 12:23:04 localhost systemd[1]: Mounting /sysroot...
Nov 24 12:23:05 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 24 12:23:05 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 24 12:23:05 localhost kernel: XFS (vda1): Ending clean mount
Nov 24 12:23:05 localhost systemd[1]: Mounted /sysroot.
Nov 24 12:23:05 localhost systemd[1]: Reached target Initrd Root File System.
Nov 24 12:23:05 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 24 12:23:05 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 24 12:23:05 localhost systemd[1]: Reached target Initrd File Systems.
Nov 24 12:23:05 localhost systemd[1]: Reached target Initrd Default Target.
Nov 24 12:23:05 localhost systemd[1]: Starting dracut mount hook...
Nov 24 12:23:05 localhost systemd[1]: Finished dracut mount hook.
Nov 24 12:23:05 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 24 12:23:05 localhost rpc.idmapd[450]: exiting on signal 15
Nov 24 12:23:05 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 24 12:23:05 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 24 12:23:05 localhost systemd[1]: Stopped target Network.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Timer Units.
Nov 24 12:23:05 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 24 12:23:05 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Basic System.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Path Units.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Remote File Systems.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Slice Units.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Socket Units.
Nov 24 12:23:05 localhost systemd[1]: Stopped target System Initialization.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Local File Systems.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Swaps.
Nov 24 12:23:05 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut mount hook.
Nov 24 12:23:05 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 24 12:23:05 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 24 12:23:05 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 24 12:23:05 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 24 12:23:05 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 24 12:23:05 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 24 12:23:05 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 24 12:23:05 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 24 12:23:05 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 24 12:23:05 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 24 12:23:05 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 24 12:23:05 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 24 12:23:05 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Closed udev Control Socket.
Nov 24 12:23:05 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Closed udev Kernel Socket.
Nov 24 12:23:05 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 24 12:23:05 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 24 12:23:05 localhost systemd[1]: Starting Cleanup udev Database...
Nov 24 12:23:05 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 24 12:23:05 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 24 12:23:05 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Stopped Create System Users.
Nov 24 12:23:05 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 24 12:23:05 localhost systemd[1]: Finished Cleanup udev Database.
Nov 24 12:23:05 localhost systemd[1]: Reached target Switch Root.
Nov 24 12:23:05 localhost systemd[1]: Starting Switch Root...
Nov 24 12:23:05 localhost systemd[1]: Switching root.
Nov 24 12:23:05 localhost systemd-journald[306]: Journal stopped
Nov 24 12:23:06 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Nov 24 12:23:06 localhost kernel: audit: type=1404 audit(1763986985.697:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability open_perms=1
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:23:06 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:23:06 localhost kernel: audit: type=1403 audit(1763986985.837:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 24 12:23:06 localhost systemd[1]: Successfully loaded SELinux policy in 143.342ms.
Nov 24 12:23:06 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.824ms.
Nov 24 12:23:06 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 24 12:23:06 localhost systemd[1]: Detected virtualization kvm.
Nov 24 12:23:06 localhost systemd[1]: Detected architecture x86-64.
Nov 24 12:23:06 localhost systemd-rc-local-generator[640]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:23:06 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Stopped Switch Root.
Nov 24 12:23:06 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 24 12:23:06 localhost systemd[1]: Created slice Slice /system/getty.
Nov 24 12:23:06 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 24 12:23:06 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 24 12:23:06 localhost systemd[1]: Created slice User and Session Slice.
Nov 24 12:23:06 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 24 12:23:06 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 24 12:23:06 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 24 12:23:06 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 24 12:23:06 localhost systemd[1]: Stopped target Switch Root.
Nov 24 12:23:06 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 24 12:23:06 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 24 12:23:06 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 24 12:23:06 localhost systemd[1]: Reached target Path Units.
Nov 24 12:23:06 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 24 12:23:06 localhost systemd[1]: Reached target Slice Units.
Nov 24 12:23:06 localhost systemd[1]: Reached target Swaps.
Nov 24 12:23:06 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 24 12:23:06 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 24 12:23:06 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 24 12:23:06 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 24 12:23:06 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 24 12:23:06 localhost systemd[1]: Listening on udev Control Socket.
Nov 24 12:23:06 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 24 12:23:06 localhost systemd[1]: Mounting Huge Pages File System...
Nov 24 12:23:06 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 24 12:23:06 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 24 12:23:06 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 24 12:23:06 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 24 12:23:06 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 24 12:23:06 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 12:23:06 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 24 12:23:06 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 24 12:23:06 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 24 12:23:06 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 24 12:23:06 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 24 12:23:06 localhost systemd[1]: Stopped Journal Service.
Nov 24 12:23:06 localhost systemd[1]: Starting Journal Service...
Nov 24 12:23:06 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 24 12:23:06 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 24 12:23:06 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 12:23:06 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 24 12:23:06 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 24 12:23:06 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 24 12:23:06 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 24 12:23:06 localhost kernel: fuse: init (API version 7.37)
Nov 24 12:23:06 localhost systemd[1]: Mounted Huge Pages File System.
Nov 24 12:23:06 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 24 12:23:06 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 24 12:23:06 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 24 12:23:06 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 24 12:23:06 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 24 12:23:06 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 12:23:06 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 24 12:23:06 localhost systemd-journald[681]: Journal started
Nov 24 12:23:06 localhost systemd-journald[681]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 24 12:23:06 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 24 12:23:06 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Started Journal Service.
Nov 24 12:23:06 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 24 12:23:06 localhost kernel: ACPI: bus type drm_connector registered
Nov 24 12:23:06 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 24 12:23:06 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 24 12:23:06 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 24 12:23:06 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 24 12:23:06 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 24 12:23:06 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 24 12:23:06 localhost systemd[1]: Mounting FUSE Control File System...
Nov 24 12:23:06 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 24 12:23:06 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 24 12:23:06 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 24 12:23:06 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 24 12:23:06 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 24 12:23:06 localhost systemd[1]: Starting Create System Users...
Nov 24 12:23:06 localhost systemd[1]: Mounted FUSE Control File System.
Nov 24 12:23:06 localhost systemd-journald[681]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 24 12:23:06 localhost systemd-journald[681]: Received client request to flush runtime journal.
Nov 24 12:23:06 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 24 12:23:06 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 24 12:23:06 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 24 12:23:06 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 24 12:23:06 localhost systemd[1]: Finished Create System Users.
Nov 24 12:23:06 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 24 12:23:06 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 24 12:23:06 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 24 12:23:06 localhost systemd[1]: Reached target Local File Systems.
Nov 24 12:23:06 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 24 12:23:06 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 24 12:23:06 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 24 12:23:06 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 24 12:23:06 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 24 12:23:06 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 24 12:23:06 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 24 12:23:06 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Nov 24 12:23:06 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 24 12:23:06 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 24 12:23:06 localhost systemd[1]: Starting Security Auditing Service...
Nov 24 12:23:06 localhost systemd[1]: Starting RPC Bind...
Nov 24 12:23:06 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 24 12:23:06 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 24 12:23:06 localhost auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 24 12:23:06 localhost auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 24 12:23:07 localhost systemd[1]: Started RPC Bind.
Nov 24 12:23:07 localhost augenrules[710]: /sbin/augenrules: No change
Nov 24 12:23:07 localhost augenrules[725]: No rules
Nov 24 12:23:07 localhost augenrules[725]: enabled 1
Nov 24 12:23:07 localhost augenrules[725]: failure 1
Nov 24 12:23:07 localhost augenrules[725]: pid 705
Nov 24 12:23:07 localhost augenrules[725]: rate_limit 0
Nov 24 12:23:07 localhost augenrules[725]: backlog_limit 8192
Nov 24 12:23:07 localhost augenrules[725]: lost 0
Nov 24 12:23:07 localhost augenrules[725]: backlog 3
Nov 24 12:23:07 localhost augenrules[725]: backlog_wait_time 60000
Nov 24 12:23:07 localhost augenrules[725]: backlog_wait_time_actual 0
Nov 24 12:23:07 localhost augenrules[725]: enabled 1
Nov 24 12:23:07 localhost augenrules[725]: failure 1
Nov 24 12:23:07 localhost augenrules[725]: pid 705
Nov 24 12:23:07 localhost augenrules[725]: rate_limit 0
Nov 24 12:23:07 localhost augenrules[725]: backlog_limit 8192
Nov 24 12:23:07 localhost augenrules[725]: lost 0
Nov 24 12:23:07 localhost augenrules[725]: backlog 0
Nov 24 12:23:07 localhost augenrules[725]: backlog_wait_time 60000
Nov 24 12:23:07 localhost augenrules[725]: backlog_wait_time_actual 0
Nov 24 12:23:07 localhost augenrules[725]: enabled 1
Nov 24 12:23:07 localhost augenrules[725]: failure 1
Nov 24 12:23:07 localhost augenrules[725]: pid 705
Nov 24 12:23:07 localhost augenrules[725]: rate_limit 0
Nov 24 12:23:07 localhost augenrules[725]: backlog_limit 8192
Nov 24 12:23:07 localhost augenrules[725]: lost 0
Nov 24 12:23:07 localhost augenrules[725]: backlog 3
Nov 24 12:23:07 localhost augenrules[725]: backlog_wait_time 60000
Nov 24 12:23:07 localhost augenrules[725]: backlog_wait_time_actual 0
Nov 24 12:23:07 localhost systemd[1]: Started Security Auditing Service.
Nov 24 12:23:07 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 24 12:23:07 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 24 12:23:07 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 24 12:23:07 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 24 12:23:07 localhost systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Nov 24 12:23:07 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 24 12:23:07 localhost systemd[1]: Starting Update is Completed...
Nov 24 12:23:07 localhost systemd[1]: Finished Update is Completed.
Nov 24 12:23:07 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 24 12:23:07 localhost systemd[1]: Reached target System Initialization.
Nov 24 12:23:07 localhost systemd[1]: Started dnf makecache --timer.
Nov 24 12:23:07 localhost systemd[1]: Started Daily rotation of log files.
Nov 24 12:23:07 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 24 12:23:07 localhost systemd[1]: Reached target Timer Units.
Nov 24 12:23:07 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 24 12:23:07 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 24 12:23:07 localhost systemd[1]: Reached target Socket Units.
Nov 24 12:23:07 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 24 12:23:07 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 12:23:07 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 24 12:23:07 localhost systemd-udevd[737]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 12:23:07 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 12:23:07 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 12:23:07 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 12:23:07 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 24 12:23:07 localhost systemd[1]: Reached target Basic System.
Nov 24 12:23:07 localhost dbus-broker-lau[771]: Ready
Nov 24 12:23:07 localhost systemd[1]: Starting NTP client/server...
Nov 24 12:23:07 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 24 12:23:08 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 24 12:23:08 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 24 12:23:08 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 24 12:23:08 localhost chronyd[790]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 24 12:23:08 localhost chronyd[790]: Loaded 0 symmetric keys
Nov 24 12:23:08 localhost chronyd[790]: Using right/UTC timezone to obtain leap second data
Nov 24 12:23:08 localhost chronyd[790]: Loaded seccomp filter (level 2)
Nov 24 12:23:08 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 24 12:23:08 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 24 12:23:08 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 24 12:23:08 localhost kernel: kvm_amd: TSC scaling supported
Nov 24 12:23:08 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 24 12:23:08 localhost kernel: kvm_amd: Nested Paging enabled
Nov 24 12:23:08 localhost kernel: kvm_amd: LBR virtualization supported
Nov 24 12:23:08 localhost kernel: Console: switching to colour dummy device 80x25
Nov 24 12:23:08 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 24 12:23:08 localhost kernel: [drm] features: -context_init
Nov 24 12:23:08 localhost kernel: [drm] number of scanouts: 1
Nov 24 12:23:08 localhost kernel: [drm] number of cap sets: 0
Nov 24 12:23:08 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 24 12:23:08 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 24 12:23:08 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 24 12:23:08 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 24 12:23:08 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 24 12:23:08 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 24 12:23:08 localhost systemd[1]: Started irqbalance daemon.
Nov 24 12:23:08 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 24 12:23:08 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 12:23:08 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 12:23:08 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 12:23:08 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 24 12:23:08 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 24 12:23:08 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 24 12:23:08 localhost systemd[1]: Starting User Login Management...
Nov 24 12:23:08 localhost systemd[1]: Started NTP client/server.
Nov 24 12:23:08 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 24 12:23:08 localhost systemd-logind[815]: New seat seat0.
Nov 24 12:23:08 localhost systemd-logind[815]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 24 12:23:08 localhost systemd-logind[815]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 24 12:23:08 localhost systemd[1]: Started User Login Management.
Nov 24 12:23:08 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 24 12:23:08 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 24 12:23:08 localhost iptables.init[800]: iptables: Applying firewall rules: [  OK  ]
Nov 24 12:23:08 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 24 12:23:09 localhost cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 24 Nov 2025 12:23:09 +0000. Up 7.80 seconds.
Nov 24 12:23:09 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 24 12:23:09 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 24 12:23:09 localhost systemd[1]: run-cloud\x2dinit-tmp-tmphj0f89wc.mount: Deactivated successfully.
Nov 24 12:23:09 localhost systemd[1]: Starting Hostname Service...
Nov 24 12:23:09 localhost systemd[1]: Started Hostname Service.
Nov 24 12:23:09 np0005533537.novalocal systemd-hostnamed[855]: Hostname set to <np0005533537.novalocal> (static)
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Reached target Preparation for Network.
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Starting Network Manager...
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7170] NetworkManager (version 1.54.1-1.el9) is starting... (boot:519b3143-8899-40d3-b573-09a79f21923a)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7175] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7326] manager[0x563df8e1f080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7378] hostname: hostname: using hostnamed
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7379] hostname: static hostname changed from (none) to "np0005533537.novalocal"
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7383] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7714] manager[0x563df8e1f080]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7717] manager[0x563df8e1f080]: rfkill: WWAN hardware radio set enabled
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7823] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7824] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7826] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7827] manager: Networking is enabled by state file
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7831] settings: Loaded settings plugin: keyfile (internal)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7864] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7900] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7935] dhcp: init: Using DHCP client 'internal'
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7940] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7963] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7982] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.7994] device (lo): Activation: starting connection 'lo' (9d06024e-4e17-4e2e-898d-e229a91ed6b5)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8010] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8016] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8057] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8064] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8068] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8072] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8075] device (eth0): carrier: link connected
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8080] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8092] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Started Network Manager.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8103] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8110] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8111] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Reached target Network.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8116] manager: NetworkManager state is now CONNECTING
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8118] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8129] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8134] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8202] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8214] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8246] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8313] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8316] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8327] device (lo): Activation: successful, device activated.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8339] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8341] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8346] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8352] device (eth0): Activation: successful, device activated.
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8361] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 12:23:09 np0005533537.novalocal NetworkManager[859]: <info>  [1763986989.8366] manager: startup complete
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Reached target NFS client services.
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Reached target Remote File Systems.
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 24 12:23:09 np0005533537.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 24 Nov 2025 12:23:10 +0000. Up 8.89 seconds.
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.173         | 255.255.255.0 | global | fa:16:3e:21:4b:69 |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe21:4b69/64 |       .       |  link  | fa:16:3e:21:4b:69 |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 24 12:23:10 np0005533537.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 12:23:11 np0005533537.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Nov 24 12:23:11 np0005533537.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 24 12:23:11 np0005533537.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Nov 24 12:23:11 np0005533537.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Nov 24 12:23:11 np0005533537.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Nov 24 12:23:11 np0005533537.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Generating public/private rsa key pair.
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: The key fingerprint is:
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: SHA256:IvDR2sy+7WgEgjMwwXuK4Hd+mMdTWYOReHGQmjsriCY root@np0005533537.novalocal
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: The key's randomart image is:
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: +---[RSA 3072]----+
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |o.      .o=.     |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |o.   . . =.      |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |.oo . . + o      |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |=..+.* o . o     |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |+oo.+.= S o .    |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |.o . +.+ o       |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |  o +.= +        |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |Eo . =.X         |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |o    .*.+        |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: +----[SHA256]-----+
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: The key fingerprint is:
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: SHA256:YttwcskNH/UZWGY3qPPbw6PGpAonpGPvzSsMxBccJrE root@np0005533537.novalocal
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: The key's randomart image is:
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: +---[ECDSA 256]---+
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |     ooo.   .o*..|
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |      +o   ..= +.|
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |    .E  o . . o  |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |     o o = +     |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |    . *.S o o    |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |     ooO     o   |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |     ++o..  + +  |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |    . oo*  . + = |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |      .o.=o ... o|
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: +----[SHA256]-----+
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: The key fingerprint is:
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: SHA256:I9J6YfmcRmMGNv3MYH2wCXuQ2IPqEP6YdAiLmo4bZDs root@np0005533537.novalocal
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: The key's randomart image is:
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: +--[ED25519 256]--+
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |       +o..      |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |. .   o.+= +     |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |.+ o .+ =.= .    |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |o = oo = * .     |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |.= B. * S +      |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |= + o+ O +       |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |+E  . . =        |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |.o.  . .         |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: |..               |
Nov 24 12:23:11 np0005533537.novalocal cloud-init[922]: +----[SHA256]-----+
Nov 24 12:23:11 np0005533537.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 24 12:23:11 np0005533537.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 24 12:23:11 np0005533537.novalocal systemd[1]: Reached target Network is Online.
Nov 24 12:23:11 np0005533537.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting System Logging Service...
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 24 12:23:12 np0005533537.novalocal sm-notify[1005]: Version 2.5.4 starting
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting Permit User Sessions...
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Finished Permit User Sessions.
Nov 24 12:23:12 np0005533537.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Nov 24 12:23:12 np0005533537.novalocal sshd[1007]: Server listening on :: port 22.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Started Command Scheduler.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Started Getty on tty1.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Reached target Login Prompts.
Nov 24 12:23:12 np0005533537.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Nov 24 12:23:12 np0005533537.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 24 12:23:12 np0005533537.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 8% if used.)
Nov 24 12:23:12 np0005533537.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Nov 24 12:23:12 np0005533537.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Nov 24 12:23:12 np0005533537.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Started System Logging Service.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Reached target Multi-User System.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 24 12:23:12 np0005533537.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 12:23:12 np0005533537.novalocal kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Nov 24 12:23:12 np0005533537.novalocal kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1138]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 24 Nov 2025 12:23:12 +0000. Up 11.02 seconds.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1265]: Unable to negotiate with 38.102.83.114 port 52564: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 24 12:23:12 np0005533537.novalocal dracut[1273]: dracut-057-102.git20250818.el9
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1275]: Unable to negotiate with 38.102.83.114 port 52586: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1290]: Unable to negotiate with 38.102.83.114 port 52602: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1255]: Connection closed by 38.102.83.114 port 52552 [preauth]
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1295]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 24 Nov 2025 12:23:12 +0000. Up 11.44 seconds.
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1269]: Connection closed by 38.102.83.114 port 52576 [preauth]
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1299]: Unable to negotiate with 38.102.83.114 port 52620: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1302]: Unable to negotiate with 38.102.83.114 port 52622: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1303]: #############################################################
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1307]: 256 SHA256:YttwcskNH/UZWGY3qPPbw6PGpAonpGPvzSsMxBccJrE root@np0005533537.novalocal (ECDSA)
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1311]: 256 SHA256:I9J6YfmcRmMGNv3MYH2wCXuQ2IPqEP6YdAiLmo4bZDs root@np0005533537.novalocal (ED25519)
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1294]: Connection closed by 38.102.83.114 port 52614 [preauth]
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1314]: 3072 SHA256:IvDR2sy+7WgEgjMwwXuK4Hd+mMdTWYOReHGQmjsriCY root@np0005533537.novalocal (RSA)
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1316]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1318]: #############################################################
Nov 24 12:23:12 np0005533537.novalocal sshd-session[1297]: Connection closed by 38.102.83.114 port 52616 [preauth]
Nov 24 12:23:12 np0005533537.novalocal dracut[1277]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 24 12:23:12 np0005533537.novalocal cloud-init[1295]: Cloud-init v. 24.4-7.el9 finished at Mon, 24 Nov 2025 12:23:12 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.65 seconds
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 24 12:23:12 np0005533537.novalocal systemd[1]: Reached target Cloud-init target.
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 24 12:23:13 np0005533537.novalocal dracut[1277]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: memstrack is not available
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: memstrack is not available
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 24 12:23:14 np0005533537.novalocal dracut[1277]: *** Including module: systemd ***
Nov 24 12:23:14 np0005533537.novalocal chronyd[790]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Nov 24 12:23:14 np0005533537.novalocal chronyd[790]: System clock TAI offset set to 37 seconds
Nov 24 12:23:15 np0005533537.novalocal dracut[1277]: *** Including module: fips ***
Nov 24 12:23:15 np0005533537.novalocal dracut[1277]: *** Including module: systemd-initrd ***
Nov 24 12:23:15 np0005533537.novalocal dracut[1277]: *** Including module: i18n ***
Nov 24 12:23:15 np0005533537.novalocal dracut[1277]: *** Including module: drm ***
Nov 24 12:23:15 np0005533537.novalocal dracut[1277]: *** Including module: prefixdevname ***
Nov 24 12:23:15 np0005533537.novalocal dracut[1277]: *** Including module: kernel-modules ***
Nov 24 12:23:16 np0005533537.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 24 12:23:16 np0005533537.novalocal chronyd[790]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]: *** Including module: kernel-modules-extra ***
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]: *** Including module: qemu ***
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]: *** Including module: fstab-sys ***
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]: *** Including module: rootfs-block ***
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]: *** Including module: terminfo ***
Nov 24 12:23:16 np0005533537.novalocal dracut[1277]: *** Including module: udev-rules ***
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: Skipping udev rule: 91-permissions.rules
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: *** Including module: virtiofs ***
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: *** Including module: dracut-systemd ***
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: *** Including module: usrmount ***
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: *** Including module: base ***
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: *** Including module: fs-lib ***
Nov 24 12:23:17 np0005533537.novalocal dracut[1277]: *** Including module: kdumpbase ***
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:   microcode_ctl module: mangling fw_dir
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]: *** Including module: openssl ***
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]: *** Including module: shutdown ***
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]: *** Including module: squash ***
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]: *** Including modules done ***
Nov 24 12:23:18 np0005533537.novalocal dracut[1277]: *** Installing kernel module dependencies ***
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 35 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 35 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 33 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 33 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 31 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 28 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 34 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 34 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 32 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 30 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 24 12:23:19 np0005533537.novalocal irqbalance[801]: IRQ 29 affinity is now unmanaged
Nov 24 12:23:19 np0005533537.novalocal dracut[1277]: *** Installing kernel module dependencies done ***
Nov 24 12:23:19 np0005533537.novalocal dracut[1277]: *** Resolving executable dependencies ***
Nov 24 12:23:19 np0005533537.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 12:23:21 np0005533537.novalocal dracut[1277]: *** Resolving executable dependencies done ***
Nov 24 12:23:21 np0005533537.novalocal dracut[1277]: *** Generating early-microcode cpio image ***
Nov 24 12:23:21 np0005533537.novalocal dracut[1277]: *** Store current command line parameters ***
Nov 24 12:23:21 np0005533537.novalocal dracut[1277]: Stored kernel commandline:
Nov 24 12:23:21 np0005533537.novalocal dracut[1277]: No dracut internal kernel commandline stored in the initramfs
Nov 24 12:23:21 np0005533537.novalocal dracut[1277]: *** Install squash loader ***
Nov 24 12:23:23 np0005533537.novalocal dracut[1277]: *** Squashing the files inside the initramfs ***
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: *** Squashing the files inside the initramfs done ***
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: *** Hardlinking files ***
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Mode:           real
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Files:          50
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Linked:         0 files
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Compared:       0 xattrs
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Compared:       0 files
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Saved:          0 B
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: Duration:       0.000507 seconds
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: *** Hardlinking files done ***
Nov 24 12:23:24 np0005533537.novalocal dracut[1277]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 24 12:23:25 np0005533537.novalocal kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Nov 24 12:23:25 np0005533537.novalocal kdumpctl[1015]: kdump: Starting kdump: [OK]
Nov 24 12:23:25 np0005533537.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 24 12:23:25 np0005533537.novalocal systemd[1]: Startup finished in 1.670s (kernel) + 2.729s (initrd) + 19.426s (userspace) = 23.826s.
Nov 24 12:23:30 np0005533537.novalocal sshd-session[4295]: Accepted publickey for zuul from 38.102.83.114 port 36212 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 24 12:23:30 np0005533537.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 24 12:23:30 np0005533537.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 24 12:23:30 np0005533537.novalocal systemd-logind[815]: New session 1 of user zuul.
Nov 24 12:23:30 np0005533537.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 24 12:23:30 np0005533537.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Queued start job for default target Main User Target.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Created slice User Application Slice.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Reached target Paths.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Reached target Timers.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Starting D-Bus User Message Bus Socket...
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Starting Create User's Volatile Files and Directories...
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Finished Create User's Volatile Files and Directories.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Listening on D-Bus User Message Bus Socket.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Reached target Sockets.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Reached target Basic System.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Reached target Main User Target.
Nov 24 12:23:31 np0005533537.novalocal systemd[4299]: Startup finished in 152ms.
Nov 24 12:23:31 np0005533537.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 24 12:23:31 np0005533537.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 24 12:23:31 np0005533537.novalocal sshd-session[4295]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:23:31 np0005533537.novalocal python3[4381]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:23:35 np0005533537.novalocal python3[4409]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:23:39 np0005533537.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 12:23:41 np0005533537.novalocal python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:23:42 np0005533537.novalocal python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 24 12:23:44 np0005533537.novalocal python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDmoCx9Wa8x0gak4zidDMclmaW9kdpWrX8LhlBonSxppoA7g17pzZn7wMHMH6rLhuDiCfm5qGYMFRCRGoKDWXA4+dhojNQPDCFRM5tb9aQzB4a/orjeUt8uy3jiwwDHoT8w5lfonfFctvescLOaSi6klS/kaFTzSATU2dwEu0uDTTC7XUXNqRj1CQVf3Q10OOy+SQ0Fwgo1j83d0T+OmwwvvGhNYO7ldV9XrjVhUBGAYNjw3uweQef/mDNh498QTEkFDnIW3j/4al5uJRAouHfnT9KWqj7tyJwf+UjnoREXZbVK2+tJG/zAvLLy6GcwEoUb79zgH9yvZebvapXkxCWL/UJ5oKmCQGXkWkBgJ5r1vZVv/a58xHQWZ3Gievq7kwZPsTgG5+j/oiXOI3BbtLm+zB21bkljgj+9BSamEe1xmECu+SCoGaaSWG2jd9smQsfl+Dr1ZwFQfHwQxaCL3nRcDGAUw0SL4rgHeWIKb49NAGdHb5q4MKxaJpORtn3VbWE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:23:45 np0005533537.novalocal python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:46 np0005533537.novalocal python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:23:46 np0005533537.novalocal python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763987025.6571205-230-184263502530092/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=f70cc30c21674685bb2e82d890b75d53_id_rsa follow=False checksum=3c0c67c7b118e97e4b6020fcca83b6321f7330e5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:46 np0005533537.novalocal python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:23:47 np0005533537.novalocal python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763987026.647301-274-220781935684526/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=f70cc30c21674685bb2e82d890b75d53_id_rsa.pub follow=False checksum=b81552dbd483007e556c8d3114c1eb13b0166c85 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:48 np0005533537.novalocal python3[4971]: ansible-ping Invoked with data=pong
Nov 24 12:23:49 np0005533537.novalocal python3[4995]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:23:52 np0005533537.novalocal python3[5053]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 24 12:23:53 np0005533537.novalocal python3[5085]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:54 np0005533537.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:54 np0005533537.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:54 np0005533537.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:55 np0005533537.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:55 np0005533537.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:56 np0005533537.novalocal sudo[5229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtnrytvpuxxrqqrvpzuijngpicfjgdul ; /usr/bin/python3'
Nov 24 12:23:56 np0005533537.novalocal sudo[5229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:23:56 np0005533537.novalocal python3[5231]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:56 np0005533537.novalocal sudo[5229]: pam_unix(sudo:session): session closed for user root
Nov 24 12:23:57 np0005533537.novalocal sudo[5307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pepkxrjgayrhwksyluxxzyrfffiojyux ; /usr/bin/python3'
Nov 24 12:23:57 np0005533537.novalocal sudo[5307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:23:57 np0005533537.novalocal python3[5309]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:23:57 np0005533537.novalocal sudo[5307]: pam_unix(sudo:session): session closed for user root
Nov 24 12:23:57 np0005533537.novalocal sudo[5380]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxgkzutyohhjbzlzyfamhenodkeiziwr ; /usr/bin/python3'
Nov 24 12:23:57 np0005533537.novalocal sudo[5380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:23:57 np0005533537.novalocal python3[5382]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763987037.0503767-27-192623625012904/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:23:58 np0005533537.novalocal sudo[5380]: pam_unix(sudo:session): session closed for user root
Nov 24 12:23:58 np0005533537.novalocal python3[5430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:23:58 np0005533537.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:23:59 np0005533537.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:23:59 np0005533537.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:23:59 np0005533537.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:00 np0005533537.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:00 np0005533537.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:00 np0005533537.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:01 np0005533537.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:01 np0005533537.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:01 np0005533537.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:02 np0005533537.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:02 np0005533537.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:02 np0005533537.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:02 np0005533537.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:03 np0005533537.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:03 np0005533537.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:03 np0005533537.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:04 np0005533537.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:04 np0005533537.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:04 np0005533537.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:05 np0005533537.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:05 np0005533537.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:05 np0005533537.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:05 np0005533537.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:06 np0005533537.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:24:08 np0005533537.novalocal sudo[6054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wieqpytnxnestxytvpksomcejinybjbu ; /usr/bin/python3'
Nov 24 12:24:08 np0005533537.novalocal sudo[6054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:09 np0005533537.novalocal python3[6056]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 24 12:24:09 np0005533537.novalocal systemd[1]: Starting Time & Date Service...
Nov 24 12:24:09 np0005533537.novalocal systemd[1]: Started Time & Date Service.
Nov 24 12:24:09 np0005533537.novalocal systemd-timedated[6058]: Changed time zone to 'UTC' (UTC).
Nov 24 12:24:09 np0005533537.novalocal sudo[6054]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:09 np0005533537.novalocal sudo[6085]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtmtroylfrstykgkklfvgipvhokmuibe ; /usr/bin/python3'
Nov 24 12:24:09 np0005533537.novalocal sudo[6085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:09 np0005533537.novalocal python3[6087]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:09 np0005533537.novalocal sudo[6085]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:10 np0005533537.novalocal python3[6163]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:24:10 np0005533537.novalocal python3[6234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763987049.7980683-203-248557622500710/source _original_basename=tmpjdk_jlv5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:10 np0005533537.novalocal python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:24:11 np0005533537.novalocal python3[6405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763987050.635424-243-81646830142859/source _original_basename=tmpmkxs_9k3 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:11 np0005533537.novalocal sudo[6505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsekmgiwwoqndkfucmjktjwlzkprhobt ; /usr/bin/python3'
Nov 24 12:24:11 np0005533537.novalocal sudo[6505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:12 np0005533537.novalocal python3[6507]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:24:12 np0005533537.novalocal sudo[6505]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:12 np0005533537.novalocal sudo[6578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyysamqntqnmnvofdpgmxgckzyqxeucn ; /usr/bin/python3'
Nov 24 12:24:12 np0005533537.novalocal sudo[6578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:12 np0005533537.novalocal python3[6580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763987051.769955-307-189510574460579/source _original_basename=tmpaegklqfm follow=False checksum=2e7e63ba56c9b487ea71081bee61c12a1e9cb2fe backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:12 np0005533537.novalocal sudo[6578]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:12 np0005533537.novalocal python3[6628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:24:13 np0005533537.novalocal python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:24:13 np0005533537.novalocal sudo[6732]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kybdqwlofcyslrcthdhubhzalmgcinsw ; /usr/bin/python3'
Nov 24 12:24:13 np0005533537.novalocal sudo[6732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:13 np0005533537.novalocal python3[6734]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:24:13 np0005533537.novalocal sudo[6732]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:14 np0005533537.novalocal sudo[6805]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvczqdbsaletyqpwhjegquecrhgtynvm ; /usr/bin/python3'
Nov 24 12:24:14 np0005533537.novalocal sudo[6805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:14 np0005533537.novalocal python3[6807]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763987053.4049203-364-47231688556680/source _original_basename=tmp35tpbnp3 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:14 np0005533537.novalocal sudo[6805]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:14 np0005533537.novalocal sudo[6856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzykmvqhapohqhlefvkfyqqdapjvqpr ; /usr/bin/python3'
Nov 24 12:24:14 np0005533537.novalocal sudo[6856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:14 np0005533537.novalocal python3[6858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3025-b0f0-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:24:14 np0005533537.novalocal sudo[6856]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:15 np0005533537.novalocal python3[6886]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-3025-b0f0-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 24 12:24:16 np0005533537.novalocal python3[6914]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:37 np0005533537.novalocal sudo[6938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzmrundvgmhrptdksqhyotjjunodccvh ; /usr/bin/python3'
Nov 24 12:24:37 np0005533537.novalocal sudo[6938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:24:37 np0005533537.novalocal python3[6940]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:24:37 np0005533537.novalocal sudo[6938]: pam_unix(sudo:session): session closed for user root
Nov 24 12:24:39 np0005533537.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 12:25:24 np0005533537.novalocal sshd-session[6943]: Connection closed by authenticating user root 185.156.73.233 port 23076 [preauth]
Nov 24 12:25:38 np0005533537.novalocal sshd-session[4308]: Received disconnect from 38.102.83.114 port 36212:11: disconnected by user
Nov 24 12:25:38 np0005533537.novalocal sshd-session[4308]: Disconnected from user zuul 38.102.83.114 port 36212
Nov 24 12:25:38 np0005533537.novalocal sshd-session[4295]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:25:38 np0005533537.novalocal systemd-logind[815]: Session 1 logged out. Waiting for processes to exit.
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 24 12:25:39 np0005533537.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 24 12:25:39 np0005533537.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1369] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 12:25:39 np0005533537.novalocal systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1611] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1648] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1655] device (eth1): carrier: link connected
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1659] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1668] policy: auto-activating connection 'Wired connection 1' (8bc818fe-af80-3cb7-9f9f-3a727bee475a)
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1674] device (eth1): Activation: starting connection 'Wired connection 1' (8bc818fe-af80-3cb7-9f9f-3a727bee475a)
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1675] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1681] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1688] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:25:39 np0005533537.novalocal NetworkManager[859]: <info>  [1763987139.1694] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:25:39 np0005533537.novalocal systemd[4299]: Starting Mark boot as successful...
Nov 24 12:25:39 np0005533537.novalocal systemd[4299]: Finished Mark boot as successful.
Nov 24 12:25:40 np0005533537.novalocal sshd-session[6950]: Accepted publickey for zuul from 38.102.83.114 port 42372 ssh2: RSA SHA256:Bm1xXVHt1OxsldPZFl0FA+3U35m71ho/RJnFYw8qFLw
Nov 24 12:25:40 np0005533537.novalocal systemd-logind[815]: New session 3 of user zuul.
Nov 24 12:25:40 np0005533537.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 24 12:25:40 np0005533537.novalocal sshd-session[6950]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:25:40 np0005533537.novalocal python3[6977]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-9775-766a-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:25:46 np0005533537.novalocal sshd-session[6980]: Invalid user jito from 193.32.162.145 port 52058
Nov 24 12:25:46 np0005533537.novalocal sshd-session[6980]: Connection closed by invalid user jito 193.32.162.145 port 52058 [preauth]
Nov 24 12:25:47 np0005533537.novalocal sudo[7057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quuspmlqytcvpvregxkzwmnjywajueqw ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 12:25:47 np0005533537.novalocal sudo[7057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:25:47 np0005533537.novalocal python3[7059]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:25:47 np0005533537.novalocal sudo[7057]: pam_unix(sudo:session): session closed for user root
Nov 24 12:25:47 np0005533537.novalocal sudo[7130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrbhhqkmmfynmkbtdkuvlrdmhlqaatca ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 12:25:47 np0005533537.novalocal sudo[7130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:25:47 np0005533537.novalocal python3[7132]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763987147.0119724-154-182330073514107/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=c56aedb62e4d6407435507be0104153f3cb6d544 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:25:47 np0005533537.novalocal sudo[7130]: pam_unix(sudo:session): session closed for user root
Nov 24 12:25:48 np0005533537.novalocal sudo[7180]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grbobdmytmyvklnxaxyezeqfghmsmvvt ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 12:25:48 np0005533537.novalocal sudo[7180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:25:48 np0005533537.novalocal python3[7182]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4051] caught SIGTERM, shutting down normally.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Stopping Network Manager...
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4067] dhcp4 (eth0): canceled DHCP transaction
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4067] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4067] dhcp4 (eth0): state changed no lease
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4070] manager: NetworkManager state is now CONNECTING
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4249] dhcp4 (eth1): canceled DHCP transaction
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4249] dhcp4 (eth1): state changed no lease
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[859]: <info>  [1763987148.4307] exiting (success)
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Stopped Network Manager.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: NetworkManager.service: Consumed 1.247s CPU time, 10.0M memory peak.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Starting Network Manager...
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.4805] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:519b3143-8899-40d3-b573-09a79f21923a)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.4806] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.4862] manager[0x5581058e5070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Starting Hostname Service...
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Started Hostname Service.
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5704] hostname: hostname: using hostnamed
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5705] hostname: static hostname changed from (none) to "np0005533537.novalocal"
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5711] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5718] manager[0x5581058e5070]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5718] manager[0x5581058e5070]: rfkill: WWAN hardware radio set enabled
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5751] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5752] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5752] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5753] manager: Networking is enabled by state file
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5755] settings: Loaded settings plugin: keyfile (internal)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5761] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5787] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5796] dhcp: init: Using DHCP client 'internal'
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5799] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5804] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5809] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5817] device (lo): Activation: starting connection 'lo' (9d06024e-4e17-4e2e-898d-e229a91ed6b5)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5824] device (eth0): carrier: link connected
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5828] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5832] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5833] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5839] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5844] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5850] device (eth1): carrier: link connected
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5854] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5858] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (8bc818fe-af80-3cb7-9f9f-3a727bee475a) (indicated)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5859] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5864] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5871] device (eth1): Activation: starting connection 'Wired connection 1' (8bc818fe-af80-3cb7-9f9f-3a727bee475a)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5879] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Started Network Manager.
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5882] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5884] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5885] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5887] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5889] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5891] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5893] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5895] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5903] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5906] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5919] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5922] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5961] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.5966] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 12:25:48 np0005533537.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6026] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6034] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6036] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6041] device (lo): Activation: successful, device activated.
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6053] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6054] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6057] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6060] device (eth0): Activation: successful, device activated.
Nov 24 12:25:48 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987148.6064] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 12:25:48 np0005533537.novalocal sudo[7180]: pam_unix(sudo:session): session closed for user root
Nov 24 12:25:48 np0005533537.novalocal python3[7268]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-9775-766a-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:25:58 np0005533537.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 12:26:18 np0005533537.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.2871] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 12:26:34 np0005533537.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 12:26:34 np0005533537.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3142] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3145] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3154] device (eth1): Activation: successful, device activated.
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3161] manager: startup complete
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3164] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <warn>  [1763987194.3169] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3177] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3289] dhcp4 (eth1): canceled DHCP transaction
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3290] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3290] dhcp4 (eth1): state changed no lease
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3314] policy: auto-activating connection 'ci-private-network' (6d0d6144-7d1b-5062-bb60-a38a6f16cac3)
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3323] device (eth1): Activation: starting connection 'ci-private-network' (6d0d6144-7d1b-5062-bb60-a38a6f16cac3)
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3325] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3331] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3347] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.3365] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.4787] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.4792] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:26:34 np0005533537.novalocal NetworkManager[7191]: <info>  [1763987194.4802] device (eth1): Activation: successful, device activated.
Nov 24 12:26:44 np0005533537.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 12:26:49 np0005533537.novalocal sshd-session[6953]: Received disconnect from 38.102.83.114 port 42372:11: disconnected by user
Nov 24 12:26:49 np0005533537.novalocal sshd-session[6953]: Disconnected from user zuul 38.102.83.114 port 42372
Nov 24 12:26:49 np0005533537.novalocal sshd-session[6950]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:26:49 np0005533537.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 24 12:26:49 np0005533537.novalocal systemd[1]: session-3.scope: Consumed 1.785s CPU time.
Nov 24 12:26:49 np0005533537.novalocal systemd-logind[815]: Session 3 logged out. Waiting for processes to exit.
Nov 24 12:26:49 np0005533537.novalocal systemd-logind[815]: Removed session 3.
Nov 24 12:26:51 np0005533537.novalocal sshd-session[7298]: Accepted publickey for zuul from 38.102.83.114 port 48126 ssh2: RSA SHA256:Bm1xXVHt1OxsldPZFl0FA+3U35m71ho/RJnFYw8qFLw
Nov 24 12:26:51 np0005533537.novalocal systemd-logind[815]: New session 4 of user zuul.
Nov 24 12:26:51 np0005533537.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 24 12:26:51 np0005533537.novalocal sshd-session[7298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:26:51 np0005533537.novalocal sudo[7377]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbjgvsyyaacdrffbskpaaupojtezzjwf ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 12:26:51 np0005533537.novalocal sudo[7377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:26:51 np0005533537.novalocal python3[7379]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:26:51 np0005533537.novalocal sudo[7377]: pam_unix(sudo:session): session closed for user root
Nov 24 12:26:52 np0005533537.novalocal sudo[7450]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqnxoblctiojttxwcmupqocwtzpyglet ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 12:26:52 np0005533537.novalocal sudo[7450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:26:52 np0005533537.novalocal python3[7452]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763987211.627312-312-228308851748862/source _original_basename=tmpizdrac9s follow=False checksum=e7ab508e8231a4e0337fc940fb5635221bdca268 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:26:52 np0005533537.novalocal sudo[7450]: pam_unix(sudo:session): session closed for user root
Nov 24 12:26:54 np0005533537.novalocal sshd-session[7301]: Connection closed by 38.102.83.114 port 48126
Nov 24 12:26:54 np0005533537.novalocal sshd-session[7298]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:26:54 np0005533537.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 24 12:26:54 np0005533537.novalocal systemd-logind[815]: Session 4 logged out. Waiting for processes to exit.
Nov 24 12:26:54 np0005533537.novalocal systemd-logind[815]: Removed session 4.
Nov 24 12:28:40 np0005533537.novalocal systemd[4299]: Created slice User Background Tasks Slice.
Nov 24 12:28:40 np0005533537.novalocal systemd[4299]: Starting Cleanup of User's Temporary Files and Directories...
Nov 24 12:28:40 np0005533537.novalocal systemd[4299]: Finished Cleanup of User's Temporary Files and Directories.
Nov 24 12:29:01 np0005533537.novalocal sshd-session[7480]: Invalid user shredstream from 193.32.162.145 port 37358
Nov 24 12:29:01 np0005533537.novalocal sshd-session[7480]: Connection closed by invalid user shredstream 193.32.162.145 port 37358 [preauth]
Nov 24 12:29:45 np0005533537.novalocal chronyd[790]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Nov 24 12:31:08 np0005533537.novalocal sshd-session[7483]: Accepted publickey for zuul from 38.102.83.114 port 49970 ssh2: RSA SHA256:Bm1xXVHt1OxsldPZFl0FA+3U35m71ho/RJnFYw8qFLw
Nov 24 12:31:08 np0005533537.novalocal systemd-logind[815]: New session 5 of user zuul.
Nov 24 12:31:08 np0005533537.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 24 12:31:08 np0005533537.novalocal sshd-session[7483]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:31:09 np0005533537.novalocal sudo[7510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzlivpopbswqngjslkpgpopbkkkxxgfa ; /usr/bin/python3'
Nov 24 12:31:09 np0005533537.novalocal sudo[7510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:09 np0005533537.novalocal python3[7512]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-38d0-9c76-000000001cc9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:31:09 np0005533537.novalocal sudo[7510]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:09 np0005533537.novalocal sudo[7538]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvgedfhotmybuswkjmsaubjhocxrbjwc ; /usr/bin/python3'
Nov 24 12:31:09 np0005533537.novalocal sudo[7538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:09 np0005533537.novalocal python3[7540]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:31:09 np0005533537.novalocal sudo[7538]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:09 np0005533537.novalocal sudo[7565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juzvxhmlittdhlhadxuzljsrdkspstpa ; /usr/bin/python3'
Nov 24 12:31:09 np0005533537.novalocal sudo[7565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:09 np0005533537.novalocal python3[7567]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:31:09 np0005533537.novalocal sudo[7565]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:10 np0005533537.novalocal sudo[7591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svnuhtnvbomtqsmtsvxetwmnttmhekpk ; /usr/bin/python3'
Nov 24 12:31:10 np0005533537.novalocal sudo[7591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:10 np0005533537.novalocal python3[7593]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:31:10 np0005533537.novalocal sudo[7591]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:10 np0005533537.novalocal sudo[7617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lictnvtvlzphvnzkrzbemsntwqkwswor ; /usr/bin/python3'
Nov 24 12:31:10 np0005533537.novalocal sudo[7617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:10 np0005533537.novalocal python3[7619]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:31:10 np0005533537.novalocal sudo[7617]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:10 np0005533537.novalocal sudo[7643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmkyfivsscweangxsufzlxdtevkpbfwh ; /usr/bin/python3'
Nov 24 12:31:10 np0005533537.novalocal sudo[7643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:11 np0005533537.novalocal python3[7645]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:31:11 np0005533537.novalocal sudo[7643]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:11 np0005533537.novalocal sudo[7721]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emwbippgxcyzjyxqlexuwykarimumicl ; /usr/bin/python3'
Nov 24 12:31:11 np0005533537.novalocal sudo[7721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:11 np0005533537.novalocal python3[7723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:31:11 np0005533537.novalocal sudo[7721]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:11 np0005533537.novalocal sudo[7794]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drqfphsrnqcmvdztjvjkocfdvqvbpfdc ; /usr/bin/python3'
Nov 24 12:31:11 np0005533537.novalocal sudo[7794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:11 np0005533537.novalocal python3[7796]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763987471.2661765-487-245268026873948/source _original_basename=tmpo758mh0v follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:31:11 np0005533537.novalocal sudo[7794]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:12 np0005533537.novalocal sudo[7844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbykrblfoqivktnuyisnppjjpdedrari ; /usr/bin/python3'
Nov 24 12:31:12 np0005533537.novalocal sudo[7844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:12 np0005533537.novalocal python3[7846]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 12:31:12 np0005533537.novalocal systemd[1]: Reloading.
Nov 24 12:31:13 np0005533537.novalocal systemd-rc-local-generator[7869]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:31:13 np0005533537.novalocal sudo[7844]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:14 np0005533537.novalocal sudo[7901]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebeshmzbolhbfvuyksqemjqrkuiqnpag ; /usr/bin/python3'
Nov 24 12:31:14 np0005533537.novalocal sudo[7901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:14 np0005533537.novalocal python3[7903]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 24 12:31:14 np0005533537.novalocal sudo[7901]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:15 np0005533537.novalocal sudo[7927]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elreghycgcsshtkegljmjtmcvcegtzfa ; /usr/bin/python3'
Nov 24 12:31:15 np0005533537.novalocal sudo[7927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:15 np0005533537.novalocal python3[7929]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:31:15 np0005533537.novalocal sudo[7927]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:15 np0005533537.novalocal sudo[7955]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaiwbxktbqxzaijzfhyiblxqpuypoyqf ; /usr/bin/python3'
Nov 24 12:31:15 np0005533537.novalocal sudo[7955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:15 np0005533537.novalocal python3[7957]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:31:15 np0005533537.novalocal sudo[7955]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:15 np0005533537.novalocal sudo[7983]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llvfpwdtljzngrenrdhaxifuhbifpzoc ; /usr/bin/python3'
Nov 24 12:31:15 np0005533537.novalocal sudo[7983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:15 np0005533537.novalocal python3[7985]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:31:15 np0005533537.novalocal sudo[7983]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:15 np0005533537.novalocal sudo[8011]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjnmqjhreiwjtqjcyezmbmtzjasbbxia ; /usr/bin/python3'
Nov 24 12:31:15 np0005533537.novalocal sudo[8011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:16 np0005533537.novalocal python3[8013]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:31:16 np0005533537.novalocal sudo[8011]: pam_unix(sudo:session): session closed for user root
Nov 24 12:31:16 np0005533537.novalocal python3[8040]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-38d0-9c76-000000001cd0-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:31:17 np0005533537.novalocal python3[8070]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 24 12:31:19 np0005533537.novalocal sshd-session[7486]: Connection closed by 38.102.83.114 port 49970
Nov 24 12:31:19 np0005533537.novalocal sshd-session[7483]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:31:19 np0005533537.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 24 12:31:19 np0005533537.novalocal systemd[1]: session-5.scope: Consumed 4.530s CPU time.
Nov 24 12:31:19 np0005533537.novalocal systemd-logind[815]: Session 5 logged out. Waiting for processes to exit.
Nov 24 12:31:19 np0005533537.novalocal systemd-logind[815]: Removed session 5.
Nov 24 12:31:21 np0005533537.novalocal sshd-session[8074]: Accepted publickey for zuul from 38.102.83.114 port 34286 ssh2: RSA SHA256:Bm1xXVHt1OxsldPZFl0FA+3U35m71ho/RJnFYw8qFLw
Nov 24 12:31:21 np0005533537.novalocal systemd-logind[815]: New session 6 of user zuul.
Nov 24 12:31:21 np0005533537.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 24 12:31:21 np0005533537.novalocal sshd-session[8074]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:31:21 np0005533537.novalocal sudo[8101]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeqzqcenfasatjjlrcjbywkrdnhcwrry ; /usr/bin/python3'
Nov 24 12:31:21 np0005533537.novalocal sudo[8101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:31:21 np0005533537.novalocal python3[8103]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:31:38 np0005533537.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:31:47 np0005533537.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:31:56 np0005533537.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:31:58 np0005533537.novalocal setsebool[8171]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 24 12:31:58 np0005533537.novalocal setsebool[8171]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 24 12:32:01 np0005533537.novalocal sshd-session[8180]: Connection closed by 45.148.10.240 port 38798
Nov 24 12:32:06 np0005533537.novalocal sshd-session[8181]: Invalid user bot from 193.32.162.145 port 50876
Nov 24 12:32:06 np0005533537.novalocal sshd-session[8181]: Connection closed by invalid user bot 193.32.162.145 port 50876 [preauth]
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:32:10 np0005533537.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:32:29 np0005533537.novalocal dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 24 12:32:29 np0005533537.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 12:32:29 np0005533537.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 24 12:32:30 np0005533537.novalocal systemd[1]: Reloading.
Nov 24 12:32:30 np0005533537.novalocal systemd-rc-local-generator[8933]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:32:30 np0005533537.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 12:32:31 np0005533537.novalocal sudo[8101]: pam_unix(sudo:session): session closed for user root
Nov 24 12:32:32 np0005533537.novalocal python3[10984]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-081f-8a6f-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:32:33 np0005533537.novalocal kernel: evm: overlay not supported
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: Starting D-Bus User Message Bus...
Nov 24 12:32:34 np0005533537.novalocal dbus-broker-launch[12195]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 24 12:32:34 np0005533537.novalocal dbus-broker-launch[12195]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: Started D-Bus User Message Bus.
Nov 24 12:32:34 np0005533537.novalocal dbus-broker-lau[12195]: Ready
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: Created slice Slice /user.
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: podman-11918.scope: unit configures an IP firewall, but not running as root.
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: (This warning is only shown for the first unit using IP firewalling.)
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: Started podman-11918.scope.
Nov 24 12:32:34 np0005533537.novalocal systemd[4299]: Started podman-pause-796fc9a7.scope.
Nov 24 12:32:34 np0005533537.novalocal sudo[12662]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oylbnukpmzdfycakksygwxwuanffafuu ; /usr/bin/python3'
Nov 24 12:32:34 np0005533537.novalocal sudo[12662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:32:34 np0005533537.novalocal python3[12674]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.217:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.217:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:32:34 np0005533537.novalocal python3[12674]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 24 12:32:34 np0005533537.novalocal sudo[12662]: pam_unix(sudo:session): session closed for user root
Nov 24 12:32:35 np0005533537.novalocal sshd-session[8077]: Connection closed by 38.102.83.114 port 34286
Nov 24 12:32:35 np0005533537.novalocal sshd-session[8074]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:32:35 np0005533537.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Nov 24 12:32:35 np0005533537.novalocal systemd[1]: session-6.scope: Consumed 59.239s CPU time.
Nov 24 12:32:35 np0005533537.novalocal systemd-logind[815]: Session 6 logged out. Waiting for processes to exit.
Nov 24 12:32:35 np0005533537.novalocal systemd-logind[815]: Removed session 6.
Nov 24 12:32:54 np0005533537.novalocal sshd-session[19382]: Unable to negotiate with 38.102.83.154 port 39786: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 24 12:32:54 np0005533537.novalocal sshd-session[19383]: Connection closed by 38.102.83.154 port 39766 [preauth]
Nov 24 12:32:54 np0005533537.novalocal sshd-session[19386]: Connection closed by 38.102.83.154 port 39770 [preauth]
Nov 24 12:32:54 np0005533537.novalocal sshd-session[19381]: Unable to negotiate with 38.102.83.154 port 39782: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 24 12:32:54 np0005533537.novalocal sshd-session[19388]: Unable to negotiate with 38.102.83.154 port 39802: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 24 12:32:58 np0005533537.novalocal sshd-session[20783]: Accepted publickey for zuul from 38.102.83.114 port 46122 ssh2: RSA SHA256:Bm1xXVHt1OxsldPZFl0FA+3U35m71ho/RJnFYw8qFLw
Nov 24 12:32:58 np0005533537.novalocal systemd-logind[815]: New session 7 of user zuul.
Nov 24 12:32:58 np0005533537.novalocal systemd[1]: Started Session 7 of User zuul.
Nov 24 12:32:58 np0005533537.novalocal sshd-session[20783]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:32:59 np0005533537.novalocal python3[20871]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOBm/EcM/5ZvoHqxkZYgBhHm+r+pqtHpKsRXEmh6+Du3ioUXwMOOkD0GQHNZJ9j41G77D+Wzjn5iYYPyZPxNU48= zuul@np0005533535.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:32:59 np0005533537.novalocal sudo[21031]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojtzyhoicmewsrfrehuhsezgenpoayax ; /usr/bin/python3'
Nov 24 12:32:59 np0005533537.novalocal sudo[21031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:32:59 np0005533537.novalocal python3[21045]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOBm/EcM/5ZvoHqxkZYgBhHm+r+pqtHpKsRXEmh6+Du3ioUXwMOOkD0GQHNZJ9j41G77D+Wzjn5iYYPyZPxNU48= zuul@np0005533535.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:32:59 np0005533537.novalocal sudo[21031]: pam_unix(sudo:session): session closed for user root
Nov 24 12:33:00 np0005533537.novalocal sudo[21403]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zctemrkfjqrbhfxjefraprjqekonlzvj ; /usr/bin/python3'
Nov 24 12:33:00 np0005533537.novalocal sudo[21403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:33:00 np0005533537.novalocal python3[21412]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005533537.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 24 12:33:00 np0005533537.novalocal useradd[21482]: new group: name=cloud-admin, GID=1002
Nov 24 12:33:00 np0005533537.novalocal useradd[21482]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 24 12:33:01 np0005533537.novalocal sudo[21403]: pam_unix(sudo:session): session closed for user root
Nov 24 12:33:01 np0005533537.novalocal sudo[21856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oppuaalpfalojpqjbcjktqjktweykyro ; /usr/bin/python3'
Nov 24 12:33:01 np0005533537.novalocal sudo[21856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:33:01 np0005533537.novalocal python3[21862]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOBm/EcM/5ZvoHqxkZYgBhHm+r+pqtHpKsRXEmh6+Du3ioUXwMOOkD0GQHNZJ9j41G77D+Wzjn5iYYPyZPxNU48= zuul@np0005533535.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 12:33:01 np0005533537.novalocal sudo[21856]: pam_unix(sudo:session): session closed for user root
Nov 24 12:33:02 np0005533537.novalocal sudo[22071]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkeidbhklhewxnbdxtgxpahokasxxbog ; /usr/bin/python3'
Nov 24 12:33:02 np0005533537.novalocal sudo[22071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:33:02 np0005533537.novalocal python3[22078]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:33:02 np0005533537.novalocal sudo[22071]: pam_unix(sudo:session): session closed for user root
Nov 24 12:33:02 np0005533537.novalocal sudo[22280]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjqqbugrceylofbvjxyktkcjthxxyvhp ; /usr/bin/python3'
Nov 24 12:33:02 np0005533537.novalocal sudo[22280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:33:02 np0005533537.novalocal python3[22287]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763987581.9530628-152-133356232601779/source _original_basename=tmpym__fgkg follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:33:02 np0005533537.novalocal sudo[22280]: pam_unix(sudo:session): session closed for user root
Nov 24 12:33:03 np0005533537.novalocal sudo[22616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szzfqfflaoeiacpdexawsmwulwwczggp ; /usr/bin/python3'
Nov 24 12:33:03 np0005533537.novalocal sudo[22616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:33:03 np0005533537.novalocal python3[22626]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 24 12:33:03 np0005533537.novalocal systemd[1]: Starting Hostname Service...
Nov 24 12:33:03 np0005533537.novalocal systemd[1]: Started Hostname Service.
Nov 24 12:33:03 np0005533537.novalocal systemd-hostnamed[22692]: Changed pretty hostname to 'compute-1'
Nov 24 12:33:03 compute-1 systemd-hostnamed[22692]: Hostname set to <compute-1> (static)
Nov 24 12:33:03 compute-1 NetworkManager[7191]: <info>  [1763987583.8535] hostname: static hostname changed from "np0005533537.novalocal" to "compute-1"
Nov 24 12:33:03 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 12:33:03 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 12:33:03 compute-1 sudo[22616]: pam_unix(sudo:session): session closed for user root
Nov 24 12:33:04 compute-1 sshd-session[20821]: Connection closed by 38.102.83.114 port 46122
Nov 24 12:33:04 compute-1 sshd-session[20783]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:33:04 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Nov 24 12:33:04 compute-1 systemd[1]: session-7.scope: Consumed 2.460s CPU time.
Nov 24 12:33:04 compute-1 systemd-logind[815]: Session 7 logged out. Waiting for processes to exit.
Nov 24 12:33:04 compute-1 systemd-logind[815]: Removed session 7.
Nov 24 12:33:13 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 12:33:29 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 12:33:29 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 12:33:29 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 6.515s CPU time.
Nov 24 12:33:29 compute-1 systemd[1]: run-r0ab3c2492d884469922e66e216526490.service: Deactivated successfully.
Nov 24 12:33:33 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 12:34:04 compute-1 sshd-session[29932]: Invalid user sol from 45.148.10.240 port 49988
Nov 24 12:34:05 compute-1 sshd-session[29932]: Connection closed by invalid user sol 45.148.10.240 port 49988 [preauth]
Nov 24 12:35:22 compute-1 sshd-session[29937]: Invalid user trader from 193.32.162.145 port 36168
Nov 24 12:35:22 compute-1 sshd-session[29937]: Connection closed by invalid user trader 193.32.162.145 port 36168 [preauth]
Nov 24 12:35:59 compute-1 sshd-session[29939]: Invalid user solana from 45.148.10.240 port 55420
Nov 24 12:35:59 compute-1 sshd-session[29939]: Connection closed by invalid user solana 45.148.10.240 port 55420 [preauth]
Nov 24 12:36:16 compute-1 sshd-session[29941]: Connection closed by authenticating user root 185.156.73.233 port 39724 [preauth]
Nov 24 12:36:50 compute-1 sshd-session[29943]: Accepted publickey for zuul from 38.102.83.154 port 33820 ssh2: RSA SHA256:Bm1xXVHt1OxsldPZFl0FA+3U35m71ho/RJnFYw8qFLw
Nov 24 12:36:50 compute-1 systemd-logind[815]: New session 8 of user zuul.
Nov 24 12:36:50 compute-1 systemd[1]: Started Session 8 of User zuul.
Nov 24 12:36:50 compute-1 sshd-session[29943]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:36:50 compute-1 python3[30019]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:36:53 compute-1 sudo[30133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdiqtzzdhbsqwropkfvingmtahymvhym ; /usr/bin/python3'
Nov 24 12:36:53 compute-1 sudo[30133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:53 compute-1 python3[30135]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:53 compute-1 sudo[30133]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:54 compute-1 sudo[30206]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npqjqmujynuiaxkaxrcuqsofsufqncmg ; /usr/bin/python3'
Nov 24 12:36:54 compute-1 sudo[30206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:54 compute-1 python3[30208]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:54 compute-1 sudo[30206]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:54 compute-1 sudo[30232]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncoyhnyclfnizjvqotskbekukyznrdby ; /usr/bin/python3'
Nov 24 12:36:54 compute-1 sudo[30232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:54 compute-1 python3[30234]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:54 compute-1 sudo[30232]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:54 compute-1 sudo[30305]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmkgdtdgmkzkspsnqrlawkuooauzblcv ; /usr/bin/python3'
Nov 24 12:36:54 compute-1 sudo[30305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:54 compute-1 python3[30307]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:54 compute-1 sudo[30305]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:54 compute-1 sudo[30331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpiesssjodpludfsprqpeilrqcohuree ; /usr/bin/python3'
Nov 24 12:36:54 compute-1 sudo[30331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:55 compute-1 python3[30333]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:55 compute-1 sudo[30331]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:55 compute-1 sudo[30404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkuvanqnnyhcmqgpjdwqecoywgqytqnn ; /usr/bin/python3'
Nov 24 12:36:55 compute-1 sudo[30404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:55 compute-1 python3[30406]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:55 compute-1 sudo[30404]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:55 compute-1 sudo[30430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dntchsnhqhpkycywstisvwebwgivxcww ; /usr/bin/python3'
Nov 24 12:36:55 compute-1 sudo[30430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:55 compute-1 python3[30432]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:55 compute-1 sudo[30430]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:55 compute-1 sudo[30503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkpliwptovhahvlpqzvmboirlwdalasg ; /usr/bin/python3'
Nov 24 12:36:55 compute-1 sudo[30503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:56 compute-1 python3[30505]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:56 compute-1 sudo[30503]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:56 compute-1 sudo[30529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glvtyimtvrtluxvxncugfzpazbuaqiqv ; /usr/bin/python3'
Nov 24 12:36:56 compute-1 sudo[30529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:56 compute-1 python3[30531]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:56 compute-1 sudo[30529]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:56 compute-1 sudo[30602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldsjngxdyiudbvycyysxpbwxapzewldc ; /usr/bin/python3'
Nov 24 12:36:56 compute-1 sudo[30602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:56 compute-1 python3[30604]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:56 compute-1 sudo[30602]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:56 compute-1 sudo[30628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhfcxtupmzbcdmygtyaplkkrbxppbyxt ; /usr/bin/python3'
Nov 24 12:36:56 compute-1 sudo[30628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:56 compute-1 python3[30630]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:56 compute-1 sudo[30628]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:57 compute-1 sudo[30701]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xenenupfybwiikbqcklcxyfkxxapnlea ; /usr/bin/python3'
Nov 24 12:36:57 compute-1 sudo[30701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:57 compute-1 python3[30703]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:57 compute-1 sudo[30701]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:57 compute-1 sudo[30727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raykedxfrzgnohmcphrpnezpkwdyvhpr ; /usr/bin/python3'
Nov 24 12:36:57 compute-1 sudo[30727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:57 compute-1 python3[30729]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 12:36:57 compute-1 sudo[30727]: pam_unix(sudo:session): session closed for user root
Nov 24 12:36:57 compute-1 sudo[30800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgnmtyekptupyvzrlwrbexqfqjzxxhrn ; /usr/bin/python3'
Nov 24 12:36:57 compute-1 sudo[30800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:36:57 compute-1 python3[30802]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763987813.4543037-33751-51354912769529/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:36:57 compute-1 sudo[30800]: pam_unix(sudo:session): session closed for user root
Nov 24 12:37:48 compute-1 sshd-session[30827]: Invalid user sol from 45.148.10.240 port 60896
Nov 24 12:37:48 compute-1 sshd-session[30827]: Connection closed by invalid user sol 45.148.10.240 port 60896 [preauth]
Nov 24 12:37:53 compute-1 python3[30852]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:38:36 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 24 12:38:36 compute-1 sshd-session[30856]: Invalid user trading from 193.32.162.145 port 49688
Nov 24 12:38:36 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 24 12:38:36 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 24 12:38:36 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 24 12:38:36 compute-1 sshd-session[30856]: Connection closed by invalid user trading 193.32.162.145 port 49688 [preauth]
Nov 24 12:39:38 compute-1 sshd-session[30861]: Invalid user ubuntu from 45.148.10.240 port 35370
Nov 24 12:39:38 compute-1 sshd-session[30861]: Connection closed by invalid user ubuntu 45.148.10.240 port 35370 [preauth]
Nov 24 12:40:23 compute-1 sshd-session[30864]: Received disconnect from 193.46.255.99 port 46960:11:  [preauth]
Nov 24 12:40:23 compute-1 sshd-session[30864]: Disconnected from authenticating user root 193.46.255.99 port 46960 [preauth]
Nov 24 12:40:28 compute-1 sshd-session[30866]: Connection closed by 62.87.151.183 port 39746
Nov 24 12:41:32 compute-1 sshd-session[30867]: Invalid user ubuntu from 45.148.10.240 port 39260
Nov 24 12:41:32 compute-1 sshd-session[30867]: Connection closed by invalid user ubuntu 45.148.10.240 port 39260 [preauth]
Nov 24 12:41:56 compute-1 sshd-session[30869]: Invalid user ubuntu from 193.32.162.145 port 34980
Nov 24 12:41:56 compute-1 sshd-session[30869]: Connection closed by invalid user ubuntu 193.32.162.145 port 34980 [preauth]
Nov 24 12:42:53 compute-1 sshd-session[29946]: Received disconnect from 38.102.83.154 port 33820:11: disconnected by user
Nov 24 12:42:53 compute-1 sshd-session[29946]: Disconnected from user zuul 38.102.83.154 port 33820
Nov 24 12:42:53 compute-1 sshd-session[29943]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:42:53 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Nov 24 12:42:53 compute-1 systemd[1]: session-8.scope: Consumed 5.181s CPU time.
Nov 24 12:42:53 compute-1 systemd-logind[815]: Session 8 logged out. Waiting for processes to exit.
Nov 24 12:42:53 compute-1 systemd-logind[815]: Removed session 8.
Nov 24 12:43:21 compute-1 sshd-session[30871]: Invalid user sol from 45.148.10.240 port 41088
Nov 24 12:43:21 compute-1 sshd-session[30871]: Connection closed by invalid user sol 45.148.10.240 port 41088 [preauth]
Nov 24 12:45:06 compute-1 sshd-session[30875]: Invalid user geyser from 193.32.162.145 port 48532
Nov 24 12:45:06 compute-1 sshd-session[30875]: Connection closed by invalid user geyser 193.32.162.145 port 48532 [preauth]
Nov 24 12:45:10 compute-1 sshd-session[30877]: Invalid user solana from 45.148.10.240 port 45314
Nov 24 12:45:10 compute-1 sshd-session[30877]: Connection closed by invalid user solana 45.148.10.240 port 45314 [preauth]
Nov 24 12:47:09 compute-1 sshd-session[30881]: Connection closed by authenticating user root 185.156.73.233 port 55230 [preauth]
Nov 24 12:47:10 compute-1 sshd-session[30883]: Invalid user solana from 45.148.10.240 port 45016
Nov 24 12:47:10 compute-1 sshd-session[30883]: Connection closed by invalid user solana 45.148.10.240 port 45016 [preauth]
Nov 24 12:47:11 compute-1 sshd-session[30885]: Connection closed by 61.240.213.113 port 52940
Nov 24 12:48:19 compute-1 sshd-session[30886]: Invalid user solana from 193.32.162.145 port 33814
Nov 24 12:48:20 compute-1 sshd-session[30886]: Connection closed by invalid user solana 193.32.162.145 port 33814 [preauth]
Nov 24 12:49:10 compute-1 sshd-session[30888]: Invalid user sol from 45.148.10.240 port 36300
Nov 24 12:49:10 compute-1 sshd-session[30888]: Connection closed by invalid user sol 45.148.10.240 port 36300 [preauth]
Nov 24 12:49:43 compute-1 sshd-session[30890]: Accepted publickey for zuul from 192.168.122.30 port 59518 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:49:43 compute-1 systemd-logind[815]: New session 9 of user zuul.
Nov 24 12:49:43 compute-1 systemd[1]: Started Session 9 of User zuul.
Nov 24 12:49:43 compute-1 sshd-session[30890]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:49:44 compute-1 python3.9[31043]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:49:45 compute-1 sudo[31222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcxqnkyhchynkwfcjpztsgppmroqeunp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988585.2689886-45-116345658357531/AnsiballZ_command.py'
Nov 24 12:49:45 compute-1 sudo[31222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:49:45 compute-1 python3.9[31224]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:49:54 compute-1 sudo[31222]: pam_unix(sudo:session): session closed for user root
Nov 24 12:49:56 compute-1 sshd-session[30893]: Connection closed by 192.168.122.30 port 59518
Nov 24 12:49:56 compute-1 sshd-session[30890]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:49:56 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Nov 24 12:49:56 compute-1 systemd[1]: session-9.scope: Consumed 7.839s CPU time.
Nov 24 12:49:56 compute-1 systemd-logind[815]: Session 9 logged out. Waiting for processes to exit.
Nov 24 12:49:56 compute-1 systemd-logind[815]: Removed session 9.
Nov 24 12:50:01 compute-1 sshd-session[31281]: Accepted publickey for zuul from 192.168.122.30 port 38516 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:50:01 compute-1 systemd-logind[815]: New session 10 of user zuul.
Nov 24 12:50:01 compute-1 systemd[1]: Started Session 10 of User zuul.
Nov 24 12:50:01 compute-1 sshd-session[31281]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:50:02 compute-1 python3.9[31434]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:50:03 compute-1 sshd-session[31284]: Connection closed by 192.168.122.30 port 38516
Nov 24 12:50:03 compute-1 sshd-session[31281]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:50:03 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Nov 24 12:50:03 compute-1 systemd-logind[815]: Session 10 logged out. Waiting for processes to exit.
Nov 24 12:50:03 compute-1 systemd-logind[815]: Removed session 10.
Nov 24 12:50:19 compute-1 sshd-session[31462]: Accepted publickey for zuul from 192.168.122.30 port 42704 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:50:19 compute-1 systemd-logind[815]: New session 11 of user zuul.
Nov 24 12:50:19 compute-1 systemd[1]: Started Session 11 of User zuul.
Nov 24 12:50:19 compute-1 sshd-session[31462]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:50:19 compute-1 python3.9[31615]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 24 12:50:21 compute-1 python3.9[31789]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:50:21 compute-1 sudo[31939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjczwojxqgobnbbqnjumfgqpuzqjkwcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988621.4645696-70-173519787567268/AnsiballZ_command.py'
Nov 24 12:50:21 compute-1 sudo[31939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:22 compute-1 python3.9[31941]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:50:22 compute-1 sudo[31939]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:22 compute-1 sudo[32092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plociaxcrgdewjjfmuoiaqrnqpejjrek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988622.4602532-94-158369381742734/AnsiballZ_stat.py'
Nov 24 12:50:22 compute-1 sudo[32092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:23 compute-1 python3.9[32094]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:50:23 compute-1 sudo[32092]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:23 compute-1 sudo[32244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxordchqkmxhjcgxofmynlebzyrrbkly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988623.2644162-110-83173717408935/AnsiballZ_file.py'
Nov 24 12:50:23 compute-1 sudo[32244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:23 compute-1 python3.9[32246]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:50:23 compute-1 sudo[32244]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:24 compute-1 sudo[32396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpmqxozwbdkcxlrbwpxzsrbotcvwkhra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988624.0859585-126-105668987163714/AnsiballZ_stat.py'
Nov 24 12:50:24 compute-1 sudo[32396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:24 compute-1 python3.9[32398]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:50:24 compute-1 sudo[32396]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:25 compute-1 sudo[32519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrhxkcurrsdcgsjziablqgxyepqmcvqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988624.0859585-126-105668987163714/AnsiballZ_copy.py'
Nov 24 12:50:25 compute-1 sudo[32519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:25 compute-1 python3.9[32521]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763988624.0859585-126-105668987163714/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:50:25 compute-1 sudo[32519]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:25 compute-1 sudo[32671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utnsmrtkiigazbbhzeqojnzjlcmzorfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988625.4718878-156-234196451638054/AnsiballZ_setup.py'
Nov 24 12:50:25 compute-1 sudo[32671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:26 compute-1 python3.9[32673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:50:26 compute-1 sudo[32671]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:26 compute-1 sudo[32828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddumbmaurewaiewwkisciwaewcvltlxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988626.4682477-172-241480182435286/AnsiballZ_file.py'
Nov 24 12:50:26 compute-1 sudo[32828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:26 compute-1 python3.9[32830]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:50:26 compute-1 sudo[32828]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:27 compute-1 sudo[32980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyntstgymqisupxvxfqdvrthbymykoan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988627.232506-190-47106935976303/AnsiballZ_file.py'
Nov 24 12:50:27 compute-1 sudo[32980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:27 compute-1 python3.9[32982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:50:27 compute-1 sudo[32980]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:28 compute-1 python3.9[33132]: ansible-ansible.builtin.service_facts Invoked
Nov 24 12:50:32 compute-1 python3.9[33385]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:50:32 compute-1 python3.9[33535]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:50:34 compute-1 python3.9[33689]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:50:34 compute-1 sudo[33845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibgxzwhckdwygkewzjjkosysfiniaaun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988634.5440478-286-263362919993536/AnsiballZ_setup.py'
Nov 24 12:50:34 compute-1 sudo[33845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:35 compute-1 python3.9[33847]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:50:35 compute-1 sudo[33845]: pam_unix(sudo:session): session closed for user root
Nov 24 12:50:35 compute-1 sudo[33929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syoofelaccshfcqaklxsvigoctcmvton ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988634.5440478-286-263362919993536/AnsiballZ_dnf.py'
Nov 24 12:50:35 compute-1 sudo[33929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:50:36 compute-1 python3.9[33931]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:51:06 compute-1 sshd-session[34077]: Invalid user sol from 45.148.10.240 port 38844
Nov 24 12:51:06 compute-1 sshd-session[34077]: Connection closed by invalid user sol 45.148.10.240 port 38844 [preauth]
Nov 24 12:51:22 compute-1 systemd[1]: Reloading.
Nov 24 12:51:22 compute-1 systemd-rc-local-generator[34134]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:51:22 compute-1 systemd[1]: Starting dnf makecache...
Nov 24 12:51:22 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 24 12:51:22 compute-1 dnf[34142]: Failed determining last makecache time.
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-barbican-42b4c41831408a8e323 170 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 195 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-cinder-1c00d6490d88e436f26ef  95 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-stevedore-c4acc5639fd2329372142 191 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-observabilityclient-2f31846d73c 173 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-os-net-config-bbae2ed8a159b0435a473f38 184 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 192 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-designate-tests-tempest-347fdbc 199 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-glance-1fd12c29b339f30fe823e 207 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 198 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-manila-3c01b7181572c95dac462 192 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-whitebox-neutron-tests-tempest- 189 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-octavia-ba397f07a7331190208c 190 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-watcher-c014f81a8647287f6dcc 199 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-tcib-1124124ec06aadbac34f0d340b 195 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 195 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-swift-dc98a8463506ac520c469a 200 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-python-tempestconf-8515371b7cceebd4282 155 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: delorean-openstack-heat-ui-013accbfd179753bc3f0 177 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: CentOS Stream 9 - BaseOS                         78 kB/s | 7.3 kB     00:00
Nov 24 12:51:23 compute-1 systemd[1]: Reloading.
Nov 24 12:51:23 compute-1 systemd-rc-local-generator[34189]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:51:23 compute-1 dnf[34142]: CentOS Stream 9 - AppStream                      77 kB/s | 7.4 kB     00:00
Nov 24 12:51:23 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 24 12:51:23 compute-1 dnf[34142]: CentOS Stream 9 - CRB                            78 kB/s | 7.2 kB     00:00
Nov 24 12:51:23 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 24 12:51:23 compute-1 systemd[1]: Reloading.
Nov 24 12:51:23 compute-1 systemd-rc-local-generator[34236]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:51:23 compute-1 dnf[34142]: CentOS Stream 9 - Extras packages                71 kB/s | 8.3 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: dlrn-antelope-testing                           137 kB/s | 3.0 kB     00:00
Nov 24 12:51:23 compute-1 dnf[34142]: dlrn-antelope-build-deps                        146 kB/s | 3.0 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: centos9-rabbitmq                                 47 kB/s | 3.0 kB     00:00
Nov 24 12:51:24 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 24 12:51:24 compute-1 dnf[34142]: centos9-storage                                  43 kB/s | 3.0 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: centos9-opstools                                 44 kB/s | 3.0 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: NFV SIG OpenvSwitch                              65 kB/s | 3.0 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: repo-setup-centos-appstream                     203 kB/s | 4.4 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: repo-setup-centos-baseos                        110 kB/s | 3.9 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: repo-setup-centos-highavailability              155 kB/s | 3.9 kB     00:00
Nov 24 12:51:24 compute-1 dnf[34142]: repo-setup-centos-powertools                    206 kB/s | 4.3 kB     00:00
Nov 24 12:51:24 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 12:51:24 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 12:51:24 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 12:51:24 compute-1 dnf[34142]: Extra Packages for Enterprise Linux 9 - x86_64  207 kB/s |  32 kB     00:00
Nov 24 12:51:25 compute-1 dnf[34142]: Metadata cache created.
Nov 24 12:51:25 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 24 12:51:25 compute-1 systemd[1]: Finished dnf makecache.
Nov 24 12:51:25 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.656s CPU time.
Nov 24 12:51:27 compute-1 sshd-session[34280]: Connection closed by authenticating user root 193.32.162.145 port 47326 [preauth]
Nov 24 12:52:30 compute-1 kernel: SELinux:  Converting 2718 SID table entries...
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:52:30 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:52:30 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 24 12:52:30 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 12:52:30 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 12:52:30 compute-1 systemd[1]: Reloading.
Nov 24 12:52:30 compute-1 systemd-rc-local-generator[34570]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:52:31 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 12:52:31 compute-1 sudo[33929]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:31 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 12:52:31 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 12:52:31 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.191s CPU time.
Nov 24 12:52:31 compute-1 systemd[1]: run-r532713fc4105497c9e0d44fbe89dea72.service: Deactivated successfully.
Nov 24 12:52:36 compute-1 sudo[35491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npbcqgwnbzwswwkuiwesecvopgnpdvol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988756.3965476-310-7933331505638/AnsiballZ_command.py'
Nov 24 12:52:36 compute-1 sudo[35491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:36 compute-1 python3.9[35493]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:52:38 compute-1 sudo[35491]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:38 compute-1 sudo[35772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-patavyfvtfvgexdformrszpzsqttbzng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988758.2527542-326-219946861305528/AnsiballZ_selinux.py'
Nov 24 12:52:38 compute-1 sudo[35772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:39 compute-1 python3.9[35774]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 24 12:52:39 compute-1 sudo[35772]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:40 compute-1 sudo[35924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kstdpdkrudkullzaxpjuzbdzvtwmlbgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988759.6932702-348-174236923882515/AnsiballZ_command.py'
Nov 24 12:52:40 compute-1 sudo[35924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:40 compute-1 python3.9[35926]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 24 12:52:41 compute-1 sudo[35924]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:41 compute-1 sudo[36077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxwjjpirozvmyxpjxufuhucfvbgjsydp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988761.4543128-364-63996401597148/AnsiballZ_file.py'
Nov 24 12:52:41 compute-1 sudo[36077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:42 compute-1 python3.9[36079]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:52:42 compute-1 sudo[36077]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:43 compute-1 sudo[36229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fftbwdoyrttahfsnenizktjqrhskrwdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988762.568275-381-187640061838914/AnsiballZ_mount.py'
Nov 24 12:52:43 compute-1 sudo[36229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:43 compute-1 python3.9[36231]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 24 12:52:43 compute-1 sudo[36229]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:44 compute-1 sudo[36381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nftalnyispviysfardlxvhxbvlssupcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988764.138172-436-169852049672872/AnsiballZ_file.py'
Nov 24 12:52:44 compute-1 sudo[36381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:44 compute-1 python3.9[36383]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:52:44 compute-1 sudo[36381]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:45 compute-1 sudo[36533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhvvieqcycrhbhmsxsfclrpvncvlvhvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988764.9682217-452-94555584972450/AnsiballZ_stat.py'
Nov 24 12:52:45 compute-1 sudo[36533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:45 compute-1 python3.9[36535]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:52:45 compute-1 sudo[36533]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:45 compute-1 sudo[36656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrsfurwqbvqmxakyhtmeeytumpjdzyqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988764.9682217-452-94555584972450/AnsiballZ_copy.py'
Nov 24 12:52:45 compute-1 sudo[36656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:49 compute-1 python3.9[36658]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763988764.9682217-452-94555584972450/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:52:49 compute-1 sudo[36656]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:50 compute-1 sudo[36808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cedurxcdkskdpfduklzhbjlhtifusxqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988769.8681295-500-244442046218253/AnsiballZ_stat.py'
Nov 24 12:52:50 compute-1 sudo[36808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:50 compute-1 python3.9[36810]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:52:50 compute-1 sudo[36808]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:52 compute-1 sudo[36960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vozgwybwwwzeewlajryorhgeocxqsigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988772.4460611-516-21199261591158/AnsiballZ_command.py'
Nov 24 12:52:52 compute-1 sudo[36960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:52 compute-1 python3.9[36962]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:52:52 compute-1 sudo[36960]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:53 compute-1 sudo[37113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcbkryroujwujgfqejjiphkaaaecimyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988773.1611197-532-165317701907591/AnsiballZ_file.py'
Nov 24 12:52:53 compute-1 sudo[37113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:53 compute-1 python3.9[37115]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:52:53 compute-1 sudo[37113]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:54 compute-1 sudo[37265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrwhngzqqmflmbqgmyusfgmaosqcqdas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988774.0589416-554-226480075156432/AnsiballZ_getent.py'
Nov 24 12:52:54 compute-1 sudo[37265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:54 compute-1 python3.9[37267]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 24 12:52:54 compute-1 sudo[37265]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:54 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 12:52:55 compute-1 sudo[37419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijjssxwrgrctqkpcjjeefiwtofdrtmkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988774.8283257-570-118331335644549/AnsiballZ_group.py'
Nov 24 12:52:55 compute-1 sudo[37419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:55 compute-1 python3.9[37421]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 12:52:55 compute-1 groupadd[37422]: group added to /etc/group: name=qemu, GID=107
Nov 24 12:52:55 compute-1 groupadd[37422]: group added to /etc/gshadow: name=qemu
Nov 24 12:52:55 compute-1 groupadd[37422]: new group: name=qemu, GID=107
Nov 24 12:52:55 compute-1 sudo[37419]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:56 compute-1 sudo[37577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlohblmfxxbhjgouxmfxkxoqccmynarp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988775.7824886-586-130291122142822/AnsiballZ_user.py'
Nov 24 12:52:56 compute-1 sudo[37577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:56 compute-1 python3.9[37579]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 12:52:56 compute-1 useradd[37581]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 12:52:56 compute-1 sudo[37577]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:57 compute-1 sudo[37737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdlriwkjeupwznspcbqqeriahwgelmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988776.888107-602-265337468524405/AnsiballZ_getent.py'
Nov 24 12:52:57 compute-1 sudo[37737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:57 compute-1 python3.9[37739]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 24 12:52:57 compute-1 sudo[37737]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:57 compute-1 sudo[37890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlckdynfxtrunlodwdnusquhabkcwgkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988777.5655875-618-245910699984138/AnsiballZ_group.py'
Nov 24 12:52:57 compute-1 sudo[37890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:58 compute-1 python3.9[37892]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 12:52:58 compute-1 groupadd[37893]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 24 12:52:58 compute-1 groupadd[37893]: group added to /etc/gshadow: name=hugetlbfs
Nov 24 12:52:58 compute-1 groupadd[37893]: new group: name=hugetlbfs, GID=42477
Nov 24 12:52:58 compute-1 sudo[37890]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:58 compute-1 sudo[38048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abhnyqjpctdpouqsuyfxcmwwjgkpbekb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988778.3520274-636-35308305450793/AnsiballZ_file.py'
Nov 24 12:52:58 compute-1 sudo[38048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:58 compute-1 python3.9[38050]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 24 12:52:58 compute-1 sudo[38048]: pam_unix(sudo:session): session closed for user root
Nov 24 12:52:59 compute-1 sudo[38200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbqnwpzdzbcknmyypwmspnhcqwcsagid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988779.22267-658-102260083169232/AnsiballZ_dnf.py'
Nov 24 12:52:59 compute-1 sudo[38200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:52:59 compute-1 python3.9[38202]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:53:01 compute-1 sudo[38200]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:01 compute-1 sudo[38353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmvavmvciglafgajwnsuqymyebwdhhra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988781.6562493-674-26362329700384/AnsiballZ_file.py'
Nov 24 12:53:01 compute-1 sudo[38353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:02 compute-1 python3.9[38355]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:02 compute-1 sudo[38353]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:02 compute-1 sudo[38505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apodsjkfbanchamznrtlwykevafyesfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988782.3341362-690-248426474978876/AnsiballZ_stat.py'
Nov 24 12:53:02 compute-1 sudo[38505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:02 compute-1 python3.9[38507]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:53:02 compute-1 sudo[38505]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:03 compute-1 sudo[38628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clsafacirbhtrztkxobnaljonohsbnng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988782.3341362-690-248426474978876/AnsiballZ_copy.py'
Nov 24 12:53:03 compute-1 sudo[38628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:03 compute-1 python3.9[38630]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763988782.3341362-690-248426474978876/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:03 compute-1 sudo[38628]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:04 compute-1 sudo[38780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wftmukiapexonsuwnyzotqcjumcelssn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988783.5817637-721-201432330892830/AnsiballZ_systemd.py'
Nov 24 12:53:04 compute-1 sudo[38780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:04 compute-1 python3.9[38782]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:53:04 compute-1 systemd[1]: Starting Load Kernel Modules...
Nov 24 12:53:04 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 24 12:53:04 compute-1 kernel: Bridge firewalling registered
Nov 24 12:53:04 compute-1 systemd-modules-load[38786]: Inserted module 'br_netfilter'
Nov 24 12:53:04 compute-1 systemd[1]: Finished Load Kernel Modules.
Nov 24 12:53:04 compute-1 sudo[38780]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:05 compute-1 sudo[38940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyxoyzmmhyvyeavxrjgrhslnxcilbjls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988784.8812852-736-166328846406890/AnsiballZ_stat.py'
Nov 24 12:53:05 compute-1 sudo[38940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:05 compute-1 python3.9[38942]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:53:05 compute-1 sudo[38940]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:05 compute-1 sudo[39063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzpllzgjwioxqgcsglzkcgfujhddmlkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988784.8812852-736-166328846406890/AnsiballZ_copy.py'
Nov 24 12:53:05 compute-1 sudo[39063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:06 compute-1 python3.9[39065]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763988784.8812852-736-166328846406890/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:06 compute-1 sudo[39063]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:06 compute-1 sudo[39215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjyviyqvdirvppxxfrhlrxmkpnrgshkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988786.50544-772-275666013245026/AnsiballZ_dnf.py'
Nov 24 12:53:06 compute-1 sudo[39215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:06 compute-1 python3.9[39217]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:53:08 compute-1 sshd-session[39219]: Invalid user solana from 45.148.10.240 port 35570
Nov 24 12:53:08 compute-1 sshd-session[39219]: Connection closed by invalid user solana 45.148.10.240 port 35570 [preauth]
Nov 24 12:53:11 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 12:53:11 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 12:53:12 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 12:53:12 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 12:53:12 compute-1 systemd[1]: Reloading.
Nov 24 12:53:12 compute-1 systemd-rc-local-generator[39277]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:53:12 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 12:53:12 compute-1 sudo[39215]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:13 compute-1 python3.9[40725]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:53:14 compute-1 python3.9[41811]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 24 12:53:15 compute-1 python3.9[42550]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:53:16 compute-1 sudo[43376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbetmyskrkywllrhczmauoroqjgjhsqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988795.7470655-850-269215534079835/AnsiballZ_command.py'
Nov 24 12:53:16 compute-1 sudo[43376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:16 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 12:53:16 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 12:53:16 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.076s CPU time.
Nov 24 12:53:16 compute-1 systemd[1]: run-rf1419c3638e54526991dcb8c14527fa0.service: Deactivated successfully.
Nov 24 12:53:16 compute-1 python3.9[43378]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:16 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 24 12:53:16 compute-1 systemd[1]: Starting Authorization Manager...
Nov 24 12:53:16 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 24 12:53:16 compute-1 polkitd[43597]: Started polkitd version 0.117
Nov 24 12:53:16 compute-1 polkitd[43597]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 12:53:16 compute-1 polkitd[43597]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 12:53:16 compute-1 polkitd[43597]: Finished loading, compiling and executing 2 rules
Nov 24 12:53:16 compute-1 polkitd[43597]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 24 12:53:16 compute-1 systemd[1]: Started Authorization Manager.
Nov 24 12:53:16 compute-1 sudo[43376]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:17 compute-1 sudo[43765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bepgaizdqqzrbfxjidqhwsufqkkjvtjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988797.3368316-868-61072268585496/AnsiballZ_systemd.py'
Nov 24 12:53:17 compute-1 sudo[43765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:17 compute-1 python3.9[43767]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:53:18 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 24 12:53:18 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Nov 24 12:53:18 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 24 12:53:18 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 24 12:53:18 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 24 12:53:18 compute-1 sudo[43765]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:19 compute-1 python3.9[43929]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 24 12:53:22 compute-1 sudo[44079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsfwlhicmhjhebxpoqeheoxbmkuwhdga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988801.6761372-982-24855272912244/AnsiballZ_systemd.py'
Nov 24 12:53:22 compute-1 sudo[44079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:22 compute-1 python3.9[44081]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:53:22 compute-1 systemd[1]: Reloading.
Nov 24 12:53:22 compute-1 systemd-rc-local-generator[44113]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:53:22 compute-1 sudo[44079]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:23 compute-1 sudo[44269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xngnnfxvoitosheostbgazpzysbnuijx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988802.733384-982-56381393816004/AnsiballZ_systemd.py'
Nov 24 12:53:23 compute-1 sudo[44269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:23 compute-1 python3.9[44271]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:53:23 compute-1 systemd[1]: Reloading.
Nov 24 12:53:23 compute-1 systemd-rc-local-generator[44297]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:53:23 compute-1 sudo[44269]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:24 compute-1 sudo[44458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izizuyerwhiseshewzdgbiwrzswpwjos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988803.8197823-1014-95190289515264/AnsiballZ_command.py'
Nov 24 12:53:24 compute-1 sudo[44458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:24 compute-1 python3.9[44460]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:24 compute-1 sudo[44458]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:24 compute-1 sudo[44611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptnpodbrquplhxlviccthzsszuzwlkxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988804.5860903-1030-7637925902301/AnsiballZ_command.py'
Nov 24 12:53:24 compute-1 sudo[44611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:25 compute-1 python3.9[44613]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:25 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 24 12:53:25 compute-1 sudo[44611]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:25 compute-1 sudo[44764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltepjziaaccqsyditxiusmbyjmkinggs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988805.2974186-1046-189505352358490/AnsiballZ_command.py'
Nov 24 12:53:25 compute-1 sudo[44764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:25 compute-1 python3.9[44766]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:27 compute-1 sudo[44764]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:27 compute-1 sudo[44926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bauejncbwlwcyxnjrszcirxmjxoziare ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988807.447307-1062-100813710693947/AnsiballZ_command.py'
Nov 24 12:53:27 compute-1 sudo[44926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:27 compute-1 python3.9[44928]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:27 compute-1 sudo[44926]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:28 compute-1 sudo[45079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvkttusjebbszpsypcpmditibnecyqql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988808.118975-1078-100487798461741/AnsiballZ_systemd.py'
Nov 24 12:53:28 compute-1 sudo[45079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:28 compute-1 python3.9[45081]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:53:28 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 12:53:28 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Nov 24 12:53:28 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Nov 24 12:53:28 compute-1 systemd[1]: Starting Apply Kernel Variables...
Nov 24 12:53:28 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 24 12:53:28 compute-1 systemd[1]: Finished Apply Kernel Variables.
Nov 24 12:53:28 compute-1 sudo[45079]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:29 compute-1 sshd-session[31465]: Connection closed by 192.168.122.30 port 42704
Nov 24 12:53:29 compute-1 sshd-session[31462]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:53:29 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Nov 24 12:53:29 compute-1 systemd[1]: session-11.scope: Consumed 2min 12.323s CPU time.
Nov 24 12:53:29 compute-1 systemd-logind[815]: Session 11 logged out. Waiting for processes to exit.
Nov 24 12:53:29 compute-1 systemd-logind[815]: Removed session 11.
Nov 24 12:53:35 compute-1 sshd-session[45111]: Accepted publickey for zuul from 192.168.122.30 port 59740 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:53:35 compute-1 systemd-logind[815]: New session 12 of user zuul.
Nov 24 12:53:35 compute-1 systemd[1]: Started Session 12 of User zuul.
Nov 24 12:53:35 compute-1 sshd-session[45111]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:53:36 compute-1 python3.9[45264]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:53:37 compute-1 python3.9[45418]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:53:38 compute-1 sudo[45572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taxyhlahetkvbumgdrcqnaqbtdoajart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988817.9010215-81-81142121624698/AnsiballZ_command.py'
Nov 24 12:53:38 compute-1 sudo[45572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:38 compute-1 python3.9[45574]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:38 compute-1 sudo[45572]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:39 compute-1 python3.9[45725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:53:40 compute-1 sudo[45879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbljpmkcnqdmidevcvtbkwgsmvfvqvak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988819.9488053-121-277932304639110/AnsiballZ_setup.py'
Nov 24 12:53:40 compute-1 sudo[45879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:40 compute-1 python3.9[45881]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:53:40 compute-1 sudo[45879]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:41 compute-1 sudo[45963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzrsrsjgvlsvwnrylnglgstxlheuyogk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988819.9488053-121-277932304639110/AnsiballZ_dnf.py'
Nov 24 12:53:41 compute-1 sudo[45963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:41 compute-1 python3.9[45965]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:53:42 compute-1 sudo[45963]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:43 compute-1 sudo[46116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-askcdwfntopdxeudevtxtkgjzhpjahln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988822.788012-145-203988067428596/AnsiballZ_setup.py'
Nov 24 12:53:43 compute-1 sudo[46116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:43 compute-1 python3.9[46118]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:53:43 compute-1 sudo[46116]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:44 compute-1 sudo[46287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfhwcozrszknrxqmoajormmfkxnjjwys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988823.8198812-167-5106492345889/AnsiballZ_file.py'
Nov 24 12:53:44 compute-1 sudo[46287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:44 compute-1 python3.9[46289]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:53:44 compute-1 sudo[46287]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:45 compute-1 sudo[46439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xolyzshotgsyipmqpvrrkklycibjatwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988824.6828454-183-219003339705342/AnsiballZ_command.py'
Nov 24 12:53:45 compute-1 sudo[46439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:45 compute-1 python3.9[46441]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:53:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat1204669628-merged.mount: Deactivated successfully.
Nov 24 12:53:45 compute-1 podman[46442]: 2025-11-24 12:53:45.289548625 +0000 UTC m=+0.054538766 system refresh
Nov 24 12:53:45 compute-1 sudo[46439]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:45 compute-1 sudo[46602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjshdhorfglrahllofjwjozgynupmnvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988825.4823825-199-244281443288118/AnsiballZ_stat.py'
Nov 24 12:53:45 compute-1 sudo[46602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:46 compute-1 python3.9[46604]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:53:46 compute-1 sudo[46602]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:46 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:53:46 compute-1 sudo[46725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdodsefbabiwtfpinltqyjghojdfstla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988825.4823825-199-244281443288118/AnsiballZ_copy.py'
Nov 24 12:53:46 compute-1 sudo[46725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:46 compute-1 python3.9[46727]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763988825.4823825-199-244281443288118/.source.json follow=False _original_basename=podman_network_config.j2 checksum=5c6cd4489074c4db173e7514d4e7684084a5d74d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:53:46 compute-1 sudo[46725]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:47 compute-1 sudo[46877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbdxukhwmfkeifebittwwoylkacwyabq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988826.9833255-229-220979274469893/AnsiballZ_stat.py'
Nov 24 12:53:47 compute-1 sudo[46877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:47 compute-1 python3.9[46879]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:53:47 compute-1 sudo[46877]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:47 compute-1 sudo[47000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omzsgsbjalbzwhiudzlgudrsjzpjyizg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988826.9833255-229-220979274469893/AnsiballZ_copy.py'
Nov 24 12:53:47 compute-1 sudo[47000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:47 compute-1 python3.9[47002]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763988826.9833255-229-220979274469893/.source.conf follow=False _original_basename=registries.conf.j2 checksum=8c73fbc0d7cddf5b89d40cde842a385025fa8102 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:47 compute-1 sudo[47000]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:48 compute-1 sudo[47152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puipbugkcnaibhuhczksdnvpgmvcvfnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988828.282576-261-239795047657661/AnsiballZ_ini_file.py'
Nov 24 12:53:48 compute-1 sudo[47152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:48 compute-1 python3.9[47154]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:48 compute-1 sudo[47152]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:49 compute-1 sudo[47304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrfqvnsgeyuoxbmzcogpndsjjweuwktr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988829.0233278-261-7726014122203/AnsiballZ_ini_file.py'
Nov 24 12:53:49 compute-1 sudo[47304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:49 compute-1 python3.9[47306]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:49 compute-1 sudo[47304]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:49 compute-1 sudo[47456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxkrtzagwnquskpruepfdepldssxcymv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988829.6104043-261-267969159238257/AnsiballZ_ini_file.py'
Nov 24 12:53:49 compute-1 sudo[47456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:50 compute-1 python3.9[47458]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:50 compute-1 sudo[47456]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:50 compute-1 sudo[47608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwtdbfxicydtrpwxprythxkzanemdjwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988830.2612312-261-131949497910743/AnsiballZ_ini_file.py'
Nov 24 12:53:50 compute-1 sudo[47608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:50 compute-1 python3.9[47610]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:53:50 compute-1 sudo[47608]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:52 compute-1 python3.9[47760]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:53:53 compute-1 sudo[47912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wthsdfrfyptzdaozzshkodqduxpghmqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988832.7718852-341-183601366003257/AnsiballZ_dnf.py'
Nov 24 12:53:53 compute-1 sudo[47912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:53 compute-1 python3.9[47914]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:53:54 compute-1 sudo[47912]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:54 compute-1 sudo[48065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwgekhwmhoxgoahngozwwnovxeprrhub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988834.7215214-357-60137156449465/AnsiballZ_dnf.py'
Nov 24 12:53:54 compute-1 sudo[48065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:55 compute-1 python3.9[48067]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:53:56 compute-1 sudo[48065]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:57 compute-1 sudo[48225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbnolsvjesuzavmvrxtvpvjttfplihev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988837.3974009-377-63000745397483/AnsiballZ_dnf.py'
Nov 24 12:53:57 compute-1 sudo[48225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:53:57 compute-1 python3.9[48227]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:53:59 compute-1 sudo[48225]: pam_unix(sudo:session): session closed for user root
Nov 24 12:53:59 compute-1 sudo[48378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjewviwplruvbhxpcozqtuzzqiyiijzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988839.5961194-395-243359853046217/AnsiballZ_dnf.py'
Nov 24 12:53:59 compute-1 sudo[48378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:00 compute-1 python3.9[48380]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:54:01 compute-1 sudo[48378]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:02 compute-1 sudo[48531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiqrronrujpxcgsuansmzaisbaryxkgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988841.8210766-417-214923930521589/AnsiballZ_dnf.py'
Nov 24 12:54:02 compute-1 sudo[48531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:02 compute-1 python3.9[48533]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:54:03 compute-1 sudo[48531]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:04 compute-1 sudo[48687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcphrfwzjpojpjgytjygtoqmgoyhvpyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988844.0743144-433-241974748257240/AnsiballZ_dnf.py'
Nov 24 12:54:04 compute-1 sudo[48687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:04 compute-1 python3.9[48689]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:54:07 compute-1 sudo[48687]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:08 compute-1 sudo[48857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aedciedzcsqwdvhnxrtrtbrcasuvikts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988847.8402283-451-70036471990695/AnsiballZ_dnf.py'
Nov 24 12:54:08 compute-1 sudo[48857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:08 compute-1 python3.9[48859]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:54:09 compute-1 sudo[48857]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:10 compute-1 sudo[49010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmshguqjxqawzzlrgormezrwojzgavwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988849.9126182-469-168187804478343/AnsiballZ_dnf.py'
Nov 24 12:54:10 compute-1 sudo[49010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:10 compute-1 python3.9[49012]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:54:23 compute-1 sudo[49010]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:28 compute-1 sudo[49347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thdqtckduhfvbhqelgfnzhaapkiwzozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988868.0502954-487-215560421562598/AnsiballZ_dnf.py'
Nov 24 12:54:28 compute-1 sudo[49347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:28 compute-1 python3.9[49349]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:54:29 compute-1 sudo[49347]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:30 compute-1 sudo[49503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmoygxbbfgytdekaazniiofkesvefczs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988870.2910082-509-40802548046382/AnsiballZ_file.py'
Nov 24 12:54:30 compute-1 sudo[49503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:30 compute-1 python3.9[49505]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:54:30 compute-1 sudo[49503]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:31 compute-1 sudo[49678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skbbfwubkeworvcnhlvbeasamcjkvcqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988871.0859916-525-270667363464181/AnsiballZ_stat.py'
Nov 24 12:54:31 compute-1 sudo[49678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:31 compute-1 python3.9[49680]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:54:31 compute-1 sudo[49678]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:32 compute-1 sudo[49801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlaaiiqyqujsuvbbuafhtouiwjlypgqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988871.0859916-525-270667363464181/AnsiballZ_copy.py'
Nov 24 12:54:32 compute-1 sudo[49801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:32 compute-1 python3.9[49803]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763988871.0859916-525-270667363464181/.source.json _original_basename=.miq3hzg7 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:54:32 compute-1 sudo[49801]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:33 compute-1 sudo[49953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izutttqvbmzkjdtbwdzurcxeltozbeik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988872.5583036-561-130019337676875/AnsiballZ_podman_image.py'
Nov 24 12:54:33 compute-1 sudo[49953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:33 compute-1 python3.9[49955]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 12:54:33 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2801553676-lower\x2dmapped.mount: Deactivated successfully.
Nov 24 12:54:38 compute-1 podman[49968]: 2025-11-24 12:54:38.3459241 +0000 UTC m=+4.984848104 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 24 12:54:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:38 compute-1 sudo[49953]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:39 compute-1 sudo[50261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqselqtxzppyvtcjroepokmnlknttbcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988879.637281-583-83604488876485/AnsiballZ_podman_image.py'
Nov 24 12:54:39 compute-1 sudo[50261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:40 compute-1 python3.9[50263]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 12:54:40 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:41 compute-1 sshd-session[50287]: Invalid user ubuntu from 193.32.162.145 port 60846
Nov 24 12:54:41 compute-1 sshd-session[50287]: Connection closed by invalid user ubuntu 193.32.162.145 port 60846 [preauth]
Nov 24 12:54:50 compute-1 podman[50274]: 2025-11-24 12:54:50.772720916 +0000 UTC m=+10.595439028 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 12:54:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:50 compute-1 sudo[50261]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:51 compute-1 sudo[50572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxyxyvoakwygrtgzfpuhxjvizovvbzug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988891.2901912-603-174733591155926/AnsiballZ_podman_image.py'
Nov 24 12:54:51 compute-1 sudo[50572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:51 compute-1 python3.9[50574]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 12:54:51 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:52 compute-1 podman[50586]: 2025-11-24 12:54:52.859736081 +0000 UTC m=+1.034284395 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 24 12:54:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:54:53 compute-1 sudo[50572]: pam_unix(sudo:session): session closed for user root
Nov 24 12:54:53 compute-1 sudo[50822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzysiqbijijbndoqqltimnwfttmcutqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988893.3658013-621-194735267761437/AnsiballZ_podman_image.py'
Nov 24 12:54:53 compute-1 sudo[50822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:54:53 compute-1 python3.9[50824]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 12:54:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:05 compute-1 sshd-session[50898]: Invalid user solana from 45.148.10.240 port 41248
Nov 24 12:55:05 compute-1 sshd-session[50898]: Connection closed by invalid user solana 45.148.10.240 port 41248 [preauth]
Nov 24 12:55:07 compute-1 podman[50837]: 2025-11-24 12:55:07.008454638 +0000 UTC m=+13.086169434 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 24 12:55:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:07 compute-1 sudo[50822]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:07 compute-1 sudo[51097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kckqahzlwqxmicacxcnycybxvuhlcrge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988907.508315-643-86825403391376/AnsiballZ_podman_image.py'
Nov 24 12:55:07 compute-1 sudo[51097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:07 compute-1 python3.9[51099]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 12:55:16 compute-1 podman[51112]: 2025-11-24 12:55:16.788165328 +0000 UTC m=+8.769598449 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 24 12:55:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:17 compute-1 sudo[51097]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:17 compute-1 sudo[51372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxmkdrqpcrowxqlnpiyqkxtyrsfckvlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988917.1637197-643-265422249232506/AnsiballZ_podman_image.py'
Nov 24 12:55:17 compute-1 sudo[51372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:17 compute-1 python3.9[51374]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 12:55:19 compute-1 podman[51386]: 2025-11-24 12:55:19.726353195 +0000 UTC m=+2.105625457 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 24 12:55:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:55:19 compute-1 sudo[51372]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:20 compute-1 sshd-session[45114]: Connection closed by 192.168.122.30 port 59740
Nov 24 12:55:20 compute-1 sshd-session[45111]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:55:20 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Nov 24 12:55:20 compute-1 systemd[1]: session-12.scope: Consumed 1min 43.916s CPU time.
Nov 24 12:55:20 compute-1 systemd-logind[815]: Session 12 logged out. Waiting for processes to exit.
Nov 24 12:55:20 compute-1 systemd-logind[815]: Removed session 12.
Nov 24 12:55:25 compute-1 sshd-session[51529]: Accepted publickey for zuul from 192.168.122.30 port 46820 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:55:25 compute-1 systemd-logind[815]: New session 13 of user zuul.
Nov 24 12:55:25 compute-1 systemd[1]: Started Session 13 of User zuul.
Nov 24 12:55:25 compute-1 sshd-session[51529]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:55:26 compute-1 python3.9[51682]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:55:27 compute-1 sudo[51836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufbvoatctinkyponmzlfrqnfuyggclal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988927.5937192-53-222306260191564/AnsiballZ_getent.py'
Nov 24 12:55:27 compute-1 sudo[51836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:28 compute-1 python3.9[51838]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 24 12:55:28 compute-1 sudo[51836]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:28 compute-1 sudo[51989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joqlyoluentykrylyiukptmaviipfzmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988928.4418116-69-105348880049935/AnsiballZ_group.py'
Nov 24 12:55:28 compute-1 sudo[51989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:29 compute-1 python3.9[51991]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 12:55:29 compute-1 groupadd[51992]: group added to /etc/group: name=openvswitch, GID=42476
Nov 24 12:55:29 compute-1 groupadd[51992]: group added to /etc/gshadow: name=openvswitch
Nov 24 12:55:29 compute-1 groupadd[51992]: new group: name=openvswitch, GID=42476
Nov 24 12:55:29 compute-1 sudo[51989]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:30 compute-1 sudo[52147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gozrjpjqvtscfjlrjheksyurcexcoyit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988929.6925125-85-46108486973324/AnsiballZ_user.py'
Nov 24 12:55:30 compute-1 sudo[52147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:30 compute-1 python3.9[52149]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 12:55:30 compute-1 useradd[52151]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 12:55:30 compute-1 useradd[52151]: add 'openvswitch' to group 'hugetlbfs'
Nov 24 12:55:30 compute-1 useradd[52151]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 24 12:55:30 compute-1 sudo[52147]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:31 compute-1 sudo[52307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbtidcqxgxoypnvvgalopbpvqqdlydfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988931.0468602-105-248202362521510/AnsiballZ_setup.py'
Nov 24 12:55:31 compute-1 sudo[52307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:31 compute-1 python3.9[52309]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:55:31 compute-1 sudo[52307]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:32 compute-1 sudo[52391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaucloodigbrupjzvmnbdnllqwkxyuaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988931.0468602-105-248202362521510/AnsiballZ_dnf.py'
Nov 24 12:55:32 compute-1 sudo[52391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:32 compute-1 python3.9[52393]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:55:33 compute-1 sudo[52391]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:34 compute-1 sudo[52553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbeqxgoqbmuwublvwlxmiqstrcpbojxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988934.3026512-133-190894459591575/AnsiballZ_dnf.py'
Nov 24 12:55:34 compute-1 sudo[52553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:34 compute-1 python3.9[52555]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:55:47 compute-1 kernel: SELinux:  Converting 2731 SID table entries...
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:55:47 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:55:47 compute-1 groupadd[52578]: group added to /etc/group: name=unbound, GID=993
Nov 24 12:55:47 compute-1 groupadd[52578]: group added to /etc/gshadow: name=unbound
Nov 24 12:55:47 compute-1 groupadd[52578]: new group: name=unbound, GID=993
Nov 24 12:55:48 compute-1 useradd[52585]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 24 12:55:48 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 24 12:55:48 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 24 12:55:49 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 12:55:49 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 12:55:49 compute-1 systemd[1]: Reloading.
Nov 24 12:55:49 compute-1 systemd-rc-local-generator[53084]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:55:49 compute-1 systemd-sysv-generator[53087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:55:49 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 12:55:50 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 12:55:50 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 12:55:50 compute-1 systemd[1]: run-r972ca82593c94aa887f0c325f1b3e51e.service: Deactivated successfully.
Nov 24 12:55:50 compute-1 sudo[52553]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:50 compute-1 sudo[53651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etnqvlvadoycftjrpfxqnnfputclhyoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988950.3498-149-118057407430748/AnsiballZ_systemd.py'
Nov 24 12:55:50 compute-1 sudo[53651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:51 compute-1 python3.9[53653]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 12:55:51 compute-1 systemd[1]: Reloading.
Nov 24 12:55:51 compute-1 systemd-rc-local-generator[53685]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:55:51 compute-1 systemd-sysv-generator[53688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:55:51 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Nov 24 12:55:51 compute-1 chown[53696]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 24 12:55:51 compute-1 ovs-ctl[53701]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 24 12:55:51 compute-1 ovs-ctl[53701]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 24 12:55:51 compute-1 ovs-ctl[53701]: Starting ovsdb-server [  OK  ]
Nov 24 12:55:51 compute-1 ovs-vsctl[53750]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 24 12:55:51 compute-1 ovs-vsctl[53769]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"971456df-f9ba-4c8a-bc15-c9feb573d541\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 24 12:55:51 compute-1 ovs-ctl[53701]: Configuring Open vSwitch system IDs [  OK  ]
Nov 24 12:55:51 compute-1 ovs-ctl[53701]: Enabling remote OVSDB managers [  OK  ]
Nov 24 12:55:51 compute-1 ovs-vsctl[53775]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 24 12:55:51 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Nov 24 12:55:51 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 24 12:55:51 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 24 12:55:51 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 24 12:55:51 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Nov 24 12:55:51 compute-1 ovs-ctl[53819]: Inserting openvswitch module [  OK  ]
Nov 24 12:55:52 compute-1 ovs-ctl[53788]: Starting ovs-vswitchd [  OK  ]
Nov 24 12:55:52 compute-1 ovs-ctl[53788]: Enabling remote OVSDB managers [  OK  ]
Nov 24 12:55:52 compute-1 ovs-vsctl[53838]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 24 12:55:52 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 24 12:55:52 compute-1 systemd[1]: Starting Open vSwitch...
Nov 24 12:55:52 compute-1 systemd[1]: Finished Open vSwitch.
Nov 24 12:55:52 compute-1 sudo[53651]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:53 compute-1 python3.9[53989]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:55:53 compute-1 sudo[54139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrhmjblcaipnhhsggrnjbzodkrfrqnzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988953.3662932-185-44826653289471/AnsiballZ_sefcontext.py'
Nov 24 12:55:53 compute-1 sudo[54139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:54 compute-1 python3.9[54141]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 24 12:55:55 compute-1 kernel: SELinux:  Converting 2745 SID table entries...
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 12:55:55 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 12:55:55 compute-1 sudo[54139]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:56 compute-1 python3.9[54296]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:55:57 compute-1 sudo[54452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqgdemgokdapdoxogxbeicaljfwqawxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988956.7384584-221-36621646436942/AnsiballZ_dnf.py'
Nov 24 12:55:57 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 24 12:55:57 compute-1 sudo[54452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:57 compute-1 python3.9[54454]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:55:58 compute-1 sudo[54452]: pam_unix(sudo:session): session closed for user root
Nov 24 12:55:59 compute-1 sudo[54605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfwafvgbiaturzcrugpdcpylaagurhoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988958.8262422-237-10938627926163/AnsiballZ_command.py'
Nov 24 12:55:59 compute-1 sudo[54605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:55:59 compute-1 python3.9[54607]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:56:00 compute-1 sudo[54605]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:00 compute-1 sudo[54892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaiswalcbptqjzloflqljzjzhgtqmjkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988960.351058-253-214631136525916/AnsiballZ_file.py'
Nov 24 12:56:00 compute-1 sudo[54892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:00 compute-1 python3.9[54894]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 12:56:01 compute-1 sudo[54892]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:01 compute-1 python3.9[55044]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:56:02 compute-1 sudo[55196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnsnchtlrstphzchfbionpovoapytgre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988962.0180578-285-257434429466371/AnsiballZ_dnf.py'
Nov 24 12:56:02 compute-1 sudo[55196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:02 compute-1 python3.9[55198]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:56:04 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 12:56:04 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 12:56:04 compute-1 systemd[1]: Reloading.
Nov 24 12:56:04 compute-1 systemd-rc-local-generator[55236]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:56:04 compute-1 systemd-sysv-generator[55239]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:56:04 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 12:56:04 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 12:56:04 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 12:56:04 compute-1 systemd[1]: run-re742b096dc8845638c2cfa3cbd824c56.service: Deactivated successfully.
Nov 24 12:56:05 compute-1 sudo[55196]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:05 compute-1 sudo[55512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsorpbrzqyxjgbfwniszmqoncpafsyuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988965.2985675-301-28722291194072/AnsiballZ_systemd.py'
Nov 24 12:56:05 compute-1 sudo[55512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:05 compute-1 python3.9[55514]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:56:05 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 24 12:56:05 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Nov 24 12:56:05 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Nov 24 12:56:05 compute-1 systemd[1]: Stopping Network Manager...
Nov 24 12:56:05 compute-1 NetworkManager[7191]: <info>  [1763988965.9125] caught SIGTERM, shutting down normally.
Nov 24 12:56:05 compute-1 NetworkManager[7191]: <info>  [1763988965.9137] dhcp4 (eth0): canceled DHCP transaction
Nov 24 12:56:05 compute-1 NetworkManager[7191]: <info>  [1763988965.9137] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:56:05 compute-1 NetworkManager[7191]: <info>  [1763988965.9138] dhcp4 (eth0): state changed no lease
Nov 24 12:56:05 compute-1 NetworkManager[7191]: <info>  [1763988965.9139] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 12:56:05 compute-1 NetworkManager[7191]: <info>  [1763988965.9249] exiting (success)
Nov 24 12:56:05 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 12:56:05 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 12:56:05 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 24 12:56:05 compute-1 systemd[1]: Stopped Network Manager.
Nov 24 12:56:05 compute-1 systemd[1]: NetworkManager.service: Consumed 12.080s CPU time, 4.3M memory peak, read 0B from disk, written 17.0K to disk.
Nov 24 12:56:05 compute-1 systemd[1]: Starting Network Manager...
Nov 24 12:56:05 compute-1 NetworkManager[55527]: <info>  [1763988965.9761] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:519b3143-8899-40d3-b573-09a79f21923a)
Nov 24 12:56:05 compute-1 NetworkManager[55527]: <info>  [1763988965.9762] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 12:56:05 compute-1 NetworkManager[55527]: <info>  [1763988965.9812] manager[0x556342add090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 12:56:05 compute-1 systemd[1]: Starting Hostname Service...
Nov 24 12:56:06 compute-1 systemd[1]: Started Hostname Service.
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0407] hostname: hostname: using hostnamed
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0407] hostname: static hostname changed from (none) to "compute-1"
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0412] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0416] manager[0x556342add090]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0416] manager[0x556342add090]: rfkill: WWAN hardware radio set enabled
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0437] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0446] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0446] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0447] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0447] manager: Networking is enabled by state file
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0449] settings: Loaded settings plugin: keyfile (internal)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0453] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0475] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0486] dhcp: init: Using DHCP client 'internal'
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0489] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0493] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0498] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0504] device (lo): Activation: starting connection 'lo' (9d06024e-4e17-4e2e-898d-e229a91ed6b5)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0510] device (eth0): carrier: link connected
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0515] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0519] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0519] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0524] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0529] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0534] device (eth1): carrier: link connected
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0537] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0540] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6d0d6144-7d1b-5062-bb60-a38a6f16cac3) (indicated)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0541] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0545] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0551] device (eth1): Activation: starting connection 'ci-private-network' (6d0d6144-7d1b-5062-bb60-a38a6f16cac3)
Nov 24 12:56:06 compute-1 systemd[1]: Started Network Manager.
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0560] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0566] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0569] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0570] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0573] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0576] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0579] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0581] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0585] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0592] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0595] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0632] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0643] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0649] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0651] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0654] device (lo): Activation: successful, device activated.
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0682] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0689] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 12:56:06 compute-1 systemd[1]: Starting Network Manager Wait Online...
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0736] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0740] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0745] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0749] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0751] device (eth1): Activation: successful, device activated.
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0777] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0779] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0782] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0785] device (eth0): Activation: successful, device activated.
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0792] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 12:56:06 compute-1 NetworkManager[55527]: <info>  [1763988966.0813] manager: startup complete
Nov 24 12:56:06 compute-1 sudo[55512]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:06 compute-1 systemd[1]: Finished Network Manager Wait Online.
Nov 24 12:56:06 compute-1 sudo[55738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgktgmmjhlpwsihkniklotgmryvvsasf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988966.37924-317-159487577189600/AnsiballZ_dnf.py'
Nov 24 12:56:06 compute-1 sudo[55738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:06 compute-1 python3.9[55740]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:56:12 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 12:56:12 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 12:56:12 compute-1 systemd[1]: Reloading.
Nov 24 12:56:12 compute-1 systemd-rc-local-generator[55795]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:56:12 compute-1 systemd-sysv-generator[55798]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:56:12 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 12:56:14 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 12:56:14 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 12:56:14 compute-1 systemd[1]: run-r1c74d297e6f44f4e936444381dc3fe6f.service: Deactivated successfully.
Nov 24 12:56:14 compute-1 sudo[55738]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:14 compute-1 sudo[56198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crpoqjtpyvzdwsyaksvyvzgjdzsscfdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988974.5598295-341-129986228722992/AnsiballZ_stat.py'
Nov 24 12:56:14 compute-1 sudo[56198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:15 compute-1 python3.9[56200]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:56:15 compute-1 sudo[56198]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:15 compute-1 sudo[56350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uholugvdazrjndmrchqgbggozkviuadt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988975.2640784-359-46164276348730/AnsiballZ_ini_file.py'
Nov 24 12:56:15 compute-1 sudo[56350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:15 compute-1 python3.9[56352]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:15 compute-1 sudo[56350]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:16 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 12:56:16 compute-1 sudo[56504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzoozgryaqvrpjiuhmxtzhxbrldorgti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988976.1941535-379-54622097073777/AnsiballZ_ini_file.py'
Nov 24 12:56:16 compute-1 sudo[56504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:16 compute-1 python3.9[56506]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:16 compute-1 sudo[56504]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:17 compute-1 sudo[56656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdllhtewmqjdjhyvzuvzplsbffholjdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988976.8202713-379-81133377304897/AnsiballZ_ini_file.py'
Nov 24 12:56:17 compute-1 sudo[56656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:17 compute-1 python3.9[56658]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:17 compute-1 sudo[56656]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:17 compute-1 sudo[56808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsbrvwdpckiiyypyecvpibftlwpqloer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988977.6483326-409-209027650251884/AnsiballZ_ini_file.py'
Nov 24 12:56:17 compute-1 sudo[56808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:18 compute-1 python3.9[56810]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:18 compute-1 sudo[56808]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:18 compute-1 sudo[56960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjedyxobddadfjyyluvkyhpkdbokupkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988978.2906044-409-99446831449205/AnsiballZ_ini_file.py'
Nov 24 12:56:18 compute-1 sudo[56960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:18 compute-1 python3.9[56962]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:18 compute-1 sudo[56960]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:19 compute-1 sudo[57112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddkzfvqjmwspuxqziixqovziejhixvbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988978.9421408-439-157982521344751/AnsiballZ_stat.py'
Nov 24 12:56:19 compute-1 sudo[57112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:19 compute-1 python3.9[57114]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:56:19 compute-1 sudo[57112]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:19 compute-1 sudo[57235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wawcvelngydwlluihojwsuepjjsbdere ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988978.9421408-439-157982521344751/AnsiballZ_copy.py'
Nov 24 12:56:19 compute-1 sudo[57235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:20 compute-1 python3.9[57237]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763988978.9421408-439-157982521344751/.source _original_basename=.7nr0s_a1 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:20 compute-1 sudo[57235]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:20 compute-1 sudo[57387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqfugxutctnealgounebgdvqnhizybof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988980.359104-469-242168660752226/AnsiballZ_file.py'
Nov 24 12:56:20 compute-1 sudo[57387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:20 compute-1 python3.9[57389]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:20 compute-1 sudo[57387]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:21 compute-1 sudo[57539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfbsbgrzrxttftwkyqgnwdbomxxpacpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988981.0765874-485-30604852512056/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 24 12:56:21 compute-1 sudo[57539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:21 compute-1 python3.9[57541]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 24 12:56:21 compute-1 sudo[57539]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:22 compute-1 sudo[57691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylaaooozirupgbvxaubpswxhybzycowr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988982.0168211-503-157392937968072/AnsiballZ_file.py'
Nov 24 12:56:22 compute-1 sudo[57691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:22 compute-1 python3.9[57693]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:22 compute-1 sudo[57691]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:23 compute-1 sudo[57843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phvhfttzrmwcjjjqjucvzokxaomuqvga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988982.8084276-523-63394776536270/AnsiballZ_stat.py'
Nov 24 12:56:23 compute-1 sudo[57843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:23 compute-1 sudo[57843]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:23 compute-1 sudo[57966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axgdlngrqidzkplwykobmtsjsaelursp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988982.8084276-523-63394776536270/AnsiballZ_copy.py'
Nov 24 12:56:23 compute-1 sudo[57966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:23 compute-1 sudo[57966]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:24 compute-1 sudo[58118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wedttlawiofwlzgftdnkjsxtyhzrhfxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988984.026634-553-217566427800932/AnsiballZ_slurp.py'
Nov 24 12:56:24 compute-1 sudo[58118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:24 compute-1 python3.9[58120]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 24 12:56:24 compute-1 sudo[58118]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:25 compute-1 sudo[58293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxzhwbdweohlpovxmvrqmurehugpgehe ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988984.9097645-571-274647932183620/async_wrapper.py j1676698561 300 /home/zuul/.ansible/tmp/ansible-tmp-1763988984.9097645-571-274647932183620/AnsiballZ_edpm_os_net_config.py _'
Nov 24 12:56:25 compute-1 sudo[58293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:25 compute-1 ansible-async_wrapper.py[58295]: Invoked with j1676698561 300 /home/zuul/.ansible/tmp/ansible-tmp-1763988984.9097645-571-274647932183620/AnsiballZ_edpm_os_net_config.py _
Nov 24 12:56:25 compute-1 ansible-async_wrapper.py[58298]: Starting module and watcher
Nov 24 12:56:25 compute-1 ansible-async_wrapper.py[58298]: Start watching 58299 (300)
Nov 24 12:56:25 compute-1 ansible-async_wrapper.py[58299]: Start module (58299)
Nov 24 12:56:25 compute-1 ansible-async_wrapper.py[58295]: Return async_wrapper task started.
Nov 24 12:56:25 compute-1 sudo[58293]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:25 compute-1 python3.9[58300]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 24 12:56:26 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 24 12:56:26 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 24 12:56:26 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 24 12:56:26 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 24 12:56:26 compute-1 kernel: cfg80211: failed to load regulatory.db
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4360] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4375] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4810] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4811] audit: op="connection-add" uuid="1d8f5b78-2714-4e6b-8e0e-dcdf8f3acb18" name="br-ex-br" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4823] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4824] audit: op="connection-add" uuid="f82519a1-ba06-44ee-91bd-3d9aac885c0a" name="br-ex-port" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4833] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4834] audit: op="connection-add" uuid="565db9ce-9193-49f5-9fee-f882c5d4a7b5" name="eth1-port" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4844] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4845] audit: op="connection-add" uuid="42229795-e6a8-49a3-8513-0557d4410855" name="vlan20-port" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4854] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4855] audit: op="connection-add" uuid="d18f9b32-8902-4d86-b386-25f11710a11b" name="vlan21-port" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4864] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4866] audit: op="connection-add" uuid="19858ce7-4239-4afe-a98b-e60c6d1f3ecf" name="vlan22-port" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4883] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,connection.timestamp,connection.autoconnect-priority" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4896] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4896] audit: op="connection-add" uuid="e69c1f14-6381-4269-9584-409a91ead762" name="br-ex-if" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4939] audit: op="connection-update" uuid="6d0d6144-7d1b-5062-bb60-a38a6f16cac3" name="ci-private-network" args="ipv4.routes,ipv4.addresses,ipv4.dns,ipv4.method,ipv4.never-default,ipv4.routing-rules,ipv6.addr-gen-mode,ipv6.addresses,ipv6.dns,ipv6.method,ipv6.routing-rules,ipv6.routes,ovs-external-ids.data,ovs-interface.type,connection.slave-type,connection.master,connection.timestamp,connection.controller,connection.port-type" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4953] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4954] audit: op="connection-add" uuid="152f9b88-c5ef-41b6-a9bd-647ce88c9e9f" name="vlan20-if" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4967] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4968] audit: op="connection-add" uuid="898a4b7f-6bd0-46e7-a392-f2c1e48f27d1" name="vlan21-if" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4981] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4982] audit: op="connection-add" uuid="3755a174-762d-4026-9023-ce1528c7b666" name="vlan22-if" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.4991] audit: op="connection-delete" uuid="8bc818fe-af80-3cb7-9f9f-3a727bee475a" name="Wired connection 1" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5000] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5007] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5010] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (1d8f5b78-2714-4e6b-8e0e-dcdf8f3acb18)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5010] audit: op="connection-activate" uuid="1d8f5b78-2714-4e6b-8e0e-dcdf8f3acb18" name="br-ex-br" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5012] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5016] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5019] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (f82519a1-ba06-44ee-91bd-3d9aac885c0a)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5021] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5025] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5027] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (565db9ce-9193-49f5-9fee-f882c5d4a7b5)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5029] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5033] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5036] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (42229795-e6a8-49a3-8513-0557d4410855)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5038] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5043] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5046] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (d18f9b32-8902-4d86-b386-25f11710a11b)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5047] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5052] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5056] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (19858ce7-4239-4afe-a98b-e60c6d1f3ecf)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5056] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5058] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5059] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5063] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5066] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5069] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e69c1f14-6381-4269-9584-409a91ead762)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5070] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5072] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5073] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5073] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5074] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5082] device (eth1): disconnecting for new activation request.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5083] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5085] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5087] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5087] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5090] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5094] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5097] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (152f9b88-c5ef-41b6-a9bd-647ce88c9e9f)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5098] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5101] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5102] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5103] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5106] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5110] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5113] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (898a4b7f-6bd0-46e7-a392-f2c1e48f27d1)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5114] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5117] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5119] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5120] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5122] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5126] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5129] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (3755a174-762d-4026-9023-ce1528c7b666)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5130] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5132] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5134] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5135] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5136] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5145] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5146] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5148] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5149] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5159] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5162] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 kernel: ovs-system: entered promiscuous mode
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5168] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5170] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5172] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5176] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5181] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 systemd-udevd[58306]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 12:56:27 compute-1 kernel: Timeout policy base is empty
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5183] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5187] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5192] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5195] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5197] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5199] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5203] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5205] dhcp4 (eth0): canceled DHCP transaction
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5206] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5206] dhcp4 (eth0): state changed no lease
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5207] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5216] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5224] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58301 uid=0 result="fail" reason="Device is not activated"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5227] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5234] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 24 12:56:27 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5242] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5267] device (eth1): disconnecting for new activation request.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5268] audit: op="connection-activate" uuid="6d0d6144-7d1b-5062-bb60-a38a6f16cac3" name="ci-private-network" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5270] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Nov 24 12:56:27 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5354] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5382] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5506] device (eth1): Activation: starting connection 'ci-private-network' (6d0d6144-7d1b-5062-bb60-a38a6f16cac3)
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5511] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5518] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5521] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5526] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5530] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5535] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5536] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5537] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5539] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5540] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5547] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5557] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5561] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5563] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5566] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5568] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5571] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5573] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5576] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5579] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5581] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5585] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 24 12:56:27 compute-1 kernel: br-ex: entered promiscuous mode
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5589] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5628] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5631] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5635] device (eth1): Activation: successful, device activated.
Nov 24 12:56:27 compute-1 kernel: vlan22: entered promiscuous mode
Nov 24 12:56:27 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5731] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 24 12:56:27 compute-1 systemd-udevd[58307]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5741] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 kernel: vlan20: entered promiscuous mode
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5804] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5806] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5810] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 kernel: vlan21: entered promiscuous mode
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5848] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5869] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5901] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5913] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5922] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5923] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5929] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5940] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5945] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5949] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5955] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.5970] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.6005] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.6006] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 12:56:27 compute-1 NetworkManager[55527]: <info>  [1763988987.6013] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 12:56:28 compute-1 NetworkManager[55527]: <info>  [1763988988.7159] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Nov 24 12:56:28 compute-1 NetworkManager[55527]: <info>  [1763988988.8798] checkpoint[0x556342ab4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 24 12:56:28 compute-1 NetworkManager[55527]: <info>  [1763988988.8800] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.0866] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.0877] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.2378] audit: op="networking-control" arg="global-dns-configuration" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.2401] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.2423] audit: op="networking-control" arg="global-dns-configuration" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.2445] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 sudo[58637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkmpfwlnjxtwsklwualtpjbmsgkrsubn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988988.8447735-571-63165077224006/AnsiballZ_async_status.py'
Nov 24 12:56:29 compute-1 sudo[58637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.3591] checkpoint[0x556342ab4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 24 12:56:29 compute-1 NetworkManager[55527]: <info>  [1763988989.3596] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Nov 24 12:56:29 compute-1 ansible-async_wrapper.py[58299]: Module complete (58299)
Nov 24 12:56:29 compute-1 python3.9[58639]: ansible-ansible.legacy.async_status Invoked with jid=j1676698561.58295 mode=status _async_dir=/root/.ansible_async
Nov 24 12:56:29 compute-1 sudo[58637]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:29 compute-1 sudo[58736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwbarcryojmiccynsncdpxlvfnanqbop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988988.8447735-571-63165077224006/AnsiballZ_async_status.py'
Nov 24 12:56:29 compute-1 sudo[58736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:29 compute-1 python3.9[58738]: ansible-ansible.legacy.async_status Invoked with jid=j1676698561.58295 mode=cleanup _async_dir=/root/.ansible_async
Nov 24 12:56:29 compute-1 sudo[58736]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:30 compute-1 ansible-async_wrapper.py[58298]: Done in kid B.
Nov 24 12:56:33 compute-1 sudo[58891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqmgbmumovuabtuzmjevtlpxldclrfmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988993.702135-620-188873844368427/AnsiballZ_stat.py'
Nov 24 12:56:33 compute-1 sudo[58891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:34 compute-1 python3.9[58893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:56:34 compute-1 sudo[58891]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:34 compute-1 sudo[59014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svgcaxpdvyaoddiywzjitbvqxytmbwns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988993.702135-620-188873844368427/AnsiballZ_copy.py'
Nov 24 12:56:34 compute-1 sudo[59014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:34 compute-1 python3.9[59016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763988993.702135-620-188873844368427/.source.returncode _original_basename=.0w4_r8fe follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:34 compute-1 sudo[59014]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:35 compute-1 sudo[59166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pywnbkohwhkfsstworeilimijyfuhwek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988994.9049304-652-243765605778536/AnsiballZ_stat.py'
Nov 24 12:56:35 compute-1 sudo[59166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:35 compute-1 python3.9[59168]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:56:35 compute-1 sudo[59166]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:35 compute-1 sudo[59289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnjysxfbqyrypdysyykifytjjpogknpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988994.9049304-652-243765605778536/AnsiballZ_copy.py'
Nov 24 12:56:35 compute-1 sudo[59289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:35 compute-1 python3.9[59291]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763988994.9049304-652-243765605778536/.source.cfg _original_basename=.nlyr3sw4 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:35 compute-1 sudo[59289]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:36 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 12:56:36 compute-1 sudo[59444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvwltjntrcjbbyvzdnttckrxhssejhlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763988996.1063054-682-264046130906761/AnsiballZ_systemd.py'
Nov 24 12:56:36 compute-1 sudo[59444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:36 compute-1 python3.9[59446]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:56:36 compute-1 systemd[1]: Reloading Network Manager...
Nov 24 12:56:36 compute-1 NetworkManager[55527]: <info>  [1763988996.7430] audit: op="reload" arg="0" pid=59450 uid=0 result="success"
Nov 24 12:56:36 compute-1 NetworkManager[55527]: <info>  [1763988996.7435] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 24 12:56:36 compute-1 systemd[1]: Reloaded Network Manager.
Nov 24 12:56:36 compute-1 sudo[59444]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:37 compute-1 sshd-session[51532]: Connection closed by 192.168.122.30 port 46820
Nov 24 12:56:37 compute-1 sshd-session[51529]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:56:37 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Nov 24 12:56:37 compute-1 systemd[1]: session-13.scope: Consumed 46.479s CPU time.
Nov 24 12:56:37 compute-1 systemd-logind[815]: Session 13 logged out. Waiting for processes to exit.
Nov 24 12:56:37 compute-1 systemd-logind[815]: Removed session 13.
Nov 24 12:56:42 compute-1 sshd-session[59480]: Accepted publickey for zuul from 192.168.122.30 port 59760 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:56:42 compute-1 systemd-logind[815]: New session 14 of user zuul.
Nov 24 12:56:42 compute-1 systemd[1]: Started Session 14 of User zuul.
Nov 24 12:56:42 compute-1 sshd-session[59480]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:56:43 compute-1 python3.9[59634]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:56:44 compute-1 python3.9[59788]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:56:45 compute-1 python3.9[59977]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:56:45 compute-1 sshd-session[59483]: Connection closed by 192.168.122.30 port 59760
Nov 24 12:56:45 compute-1 sshd-session[59480]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:56:45 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Nov 24 12:56:45 compute-1 systemd[1]: session-14.scope: Consumed 2.167s CPU time.
Nov 24 12:56:45 compute-1 systemd-logind[815]: Session 14 logged out. Waiting for processes to exit.
Nov 24 12:56:45 compute-1 systemd-logind[815]: Removed session 14.
Nov 24 12:56:46 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 12:56:51 compute-1 sshd-session[60007]: Accepted publickey for zuul from 192.168.122.30 port 43702 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:56:51 compute-1 systemd-logind[815]: New session 15 of user zuul.
Nov 24 12:56:51 compute-1 systemd[1]: Started Session 15 of User zuul.
Nov 24 12:56:51 compute-1 sshd-session[60007]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:56:52 compute-1 python3.9[60161]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:56:53 compute-1 python3.9[60315]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:56:54 compute-1 sudo[60469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjujgxwxhropotesipehdykvbqvenizd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989014.0769615-61-269786186495238/AnsiballZ_setup.py'
Nov 24 12:56:54 compute-1 sudo[60469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:54 compute-1 python3.9[60471]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:56:54 compute-1 sudo[60469]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:55 compute-1 sudo[60553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siklfhraskaetlsiqrspowcgqpsvbruz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989014.0769615-61-269786186495238/AnsiballZ_dnf.py'
Nov 24 12:56:55 compute-1 sudo[60553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:55 compute-1 python3.9[60555]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:56:56 compute-1 sudo[60553]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:57 compute-1 sudo[60707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-denpoqugqsfsljymagqxeuhkkycbmhhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989017.0879838-85-225493708200848/AnsiballZ_setup.py'
Nov 24 12:56:57 compute-1 sudo[60707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:57 compute-1 python3.9[60709]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:56:57 compute-1 sshd-session[60710]: Invalid user solana from 45.148.10.240 port 47386
Nov 24 12:56:58 compute-1 sudo[60707]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:58 compute-1 sshd-session[60710]: Connection closed by invalid user solana 45.148.10.240 port 47386 [preauth]
Nov 24 12:56:58 compute-1 sudo[60901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfthifrzbbyjlhxomkkxpindxjscauzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989018.273085-107-128906400303614/AnsiballZ_file.py'
Nov 24 12:56:58 compute-1 sudo[60901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:58 compute-1 python3.9[60903]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:56:58 compute-1 sudo[60901]: pam_unix(sudo:session): session closed for user root
Nov 24 12:56:59 compute-1 sudo[61053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntnmtnxkfuiiboqgjwglotgghgmtear ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989019.1220405-123-107719009616004/AnsiballZ_command.py'
Nov 24 12:56:59 compute-1 sudo[61053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:56:59 compute-1 python3.9[61055]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:56:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 12:56:59 compute-1 sudo[61053]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:00 compute-1 sudo[61217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlftuyajkixhdvjrejssaparlijlisbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989020.0725741-139-250503383156030/AnsiballZ_stat.py'
Nov 24 12:57:00 compute-1 sudo[61217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:00 compute-1 python3.9[61219]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:00 compute-1 sudo[61217]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:01 compute-1 sudo[61295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftrgjtpapwnmggvnlihcqowpoosewtbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989020.0725741-139-250503383156030/AnsiballZ_file.py'
Nov 24 12:57:01 compute-1 sudo[61295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:01 compute-1 python3.9[61297]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:01 compute-1 sudo[61295]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:01 compute-1 sudo[61447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwemsribljymqyuwtzcprcqrvvdwxzvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989021.453605-163-49375058957462/AnsiballZ_stat.py'
Nov 24 12:57:01 compute-1 sudo[61447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:01 compute-1 python3.9[61449]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:01 compute-1 sudo[61447]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:02 compute-1 sudo[61525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgqzcutsbuvvxijzmfiydfrdeurppusa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989021.453605-163-49375058957462/AnsiballZ_file.py'
Nov 24 12:57:02 compute-1 sudo[61525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:02 compute-1 python3.9[61527]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:02 compute-1 sudo[61525]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:03 compute-1 sudo[61677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjjkpvzbgupkdvipcovrrmlvgjdxgeok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989022.7215927-189-14649621401096/AnsiballZ_ini_file.py'
Nov 24 12:57:03 compute-1 sudo[61677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:03 compute-1 python3.9[61679]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:03 compute-1 sudo[61677]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:03 compute-1 sudo[61829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrskwnwmcgdakykffbftpinvxtfmbhfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989023.5216599-189-147767906220069/AnsiballZ_ini_file.py'
Nov 24 12:57:03 compute-1 sudo[61829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:03 compute-1 python3.9[61831]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:03 compute-1 sudo[61829]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:04 compute-1 sudo[61981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mildzkwihswfsylobpqakqjyjsvxxvqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989024.098557-189-20530830031086/AnsiballZ_ini_file.py'
Nov 24 12:57:04 compute-1 sudo[61981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:04 compute-1 python3.9[61983]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:04 compute-1 sudo[61981]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:04 compute-1 sudo[62133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxklcarnmookdagwmlbqgondwgiooghn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989024.6711762-189-33986727445298/AnsiballZ_ini_file.py'
Nov 24 12:57:04 compute-1 sudo[62133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:05 compute-1 python3.9[62135]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:05 compute-1 sudo[62133]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:05 compute-1 sudo[62285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-virnzlrmhthznlheppjnxngenqxmhnfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989025.6214283-251-230555259616918/AnsiballZ_dnf.py'
Nov 24 12:57:05 compute-1 sudo[62285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:06 compute-1 python3.9[62287]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:57:07 compute-1 sudo[62285]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:08 compute-1 sudo[62438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcqgxvewezfwbrkercwjgnnimxualhnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989027.912489-273-229412514661892/AnsiballZ_setup.py'
Nov 24 12:57:08 compute-1 sudo[62438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:08 compute-1 python3.9[62440]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:57:08 compute-1 sudo[62438]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:09 compute-1 sudo[62592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhqudlfbhdhbdovhamibwwutwtkelulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989028.753911-289-74535779426727/AnsiballZ_stat.py'
Nov 24 12:57:09 compute-1 sudo[62592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:09 compute-1 python3.9[62594]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:57:09 compute-1 sudo[62592]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:09 compute-1 sudo[62744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lteswdjeimyddzfmfwthjqqiyeuibalr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989029.4683402-307-158880673673948/AnsiballZ_stat.py'
Nov 24 12:57:09 compute-1 sudo[62744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:09 compute-1 python3.9[62746]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:57:09 compute-1 sudo[62744]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:10 compute-1 sudo[62896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdxhanjxgrlhqsksxfplhgcrnbhyhelr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989030.316216-327-247519264374534/AnsiballZ_command.py'
Nov 24 12:57:10 compute-1 sudo[62896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:10 compute-1 python3.9[62898]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:57:10 compute-1 sudo[62896]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:11 compute-1 sudo[63049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzamgpekjvvmeyrllgijypoelnxlppjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989031.05238-347-93939293142468/AnsiballZ_service_facts.py'
Nov 24 12:57:11 compute-1 sudo[63049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:11 compute-1 python3.9[63051]: ansible-service_facts Invoked
Nov 24 12:57:11 compute-1 network[63068]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 12:57:11 compute-1 network[63069]: 'network-scripts' will be removed from distribution in near future.
Nov 24 12:57:11 compute-1 network[63070]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 12:57:16 compute-1 sudo[63049]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:17 compute-1 sudo[63353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwggstnmtrtfslzmwbpiwkbkspdvgkth ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1763989036.84514-377-240637576463136/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1763989036.84514-377-240637576463136/args'
Nov 24 12:57:17 compute-1 sudo[63353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:17 compute-1 sudo[63353]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:17 compute-1 sudo[63520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhyprdwxijwccibpqtwpvmspjiswskvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989037.5794878-399-208713481735007/AnsiballZ_dnf.py'
Nov 24 12:57:17 compute-1 sudo[63520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:18 compute-1 python3.9[63522]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 12:57:19 compute-1 sudo[63520]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:20 compute-1 sudo[63673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeqqphphaynwfusvunmkgyvmdtksnelf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989040.022888-425-53619812699476/AnsiballZ_package_facts.py'
Nov 24 12:57:20 compute-1 sudo[63673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:21 compute-1 python3.9[63675]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 24 12:57:21 compute-1 sudo[63673]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:22 compute-1 sudo[63825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogdmjvbibtmnmjghhtgtkbdzoufdohgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989042.1451452-445-261061023293199/AnsiballZ_stat.py'
Nov 24 12:57:22 compute-1 sudo[63825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:22 compute-1 python3.9[63827]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:22 compute-1 sudo[63825]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:23 compute-1 sudo[63950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tukbxgefnkvyqxrwfytgqmgkqonmhhcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989042.1451452-445-261061023293199/AnsiballZ_copy.py'
Nov 24 12:57:23 compute-1 sudo[63950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:23 compute-1 python3.9[63952]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989042.1451452-445-261061023293199/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:23 compute-1 sudo[63950]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:23 compute-1 sudo[64104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgqkvvxrkbpyarbalantrjswmckujnrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989043.6537907-476-95574960981076/AnsiballZ_stat.py'
Nov 24 12:57:23 compute-1 sudo[64104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:24 compute-1 python3.9[64106]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:24 compute-1 sudo[64104]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:24 compute-1 sudo[64229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfrrszqpzzsjwskbcdqujzanthtqgkxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989043.6537907-476-95574960981076/AnsiballZ_copy.py'
Nov 24 12:57:24 compute-1 sudo[64229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:24 compute-1 python3.9[64231]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989043.6537907-476-95574960981076/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:24 compute-1 sudo[64229]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:26 compute-1 sudo[64383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whwpearuspbqinpudpynlkhskbyorzou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989045.6536934-517-213973271870569/AnsiballZ_lineinfile.py'
Nov 24 12:57:26 compute-1 sudo[64383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:26 compute-1 python3.9[64385]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:26 compute-1 sudo[64383]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:27 compute-1 sudo[64537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygshzgewdoawftpdcjvdasxulkjorpvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989047.2031274-547-171467183038004/AnsiballZ_setup.py'
Nov 24 12:57:27 compute-1 sudo[64537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:27 compute-1 python3.9[64539]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:57:28 compute-1 sudo[64537]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:28 compute-1 sudo[64621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvjydrwgwchdgpsbwivzvqlmiqqctsak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989047.2031274-547-171467183038004/AnsiballZ_systemd.py'
Nov 24 12:57:28 compute-1 sudo[64621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:28 compute-1 python3.9[64623]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:57:28 compute-1 sudo[64621]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:29 compute-1 sudo[64775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkgyuallrzzsfsdedwyjjkcgxzufitou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989049.7054517-579-79804933206736/AnsiballZ_setup.py'
Nov 24 12:57:29 compute-1 sudo[64775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:30 compute-1 python3.9[64777]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:57:30 compute-1 sudo[64775]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:30 compute-1 sudo[64859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyihjvbotdrwjaclkfhuruodewpfjous ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989049.7054517-579-79804933206736/AnsiballZ_systemd.py'
Nov 24 12:57:30 compute-1 sudo[64859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:30 compute-1 python3.9[64861]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:57:30 compute-1 chronyd[790]: chronyd exiting
Nov 24 12:57:30 compute-1 systemd[1]: Stopping NTP client/server...
Nov 24 12:57:30 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Nov 24 12:57:30 compute-1 systemd[1]: Stopped NTP client/server.
Nov 24 12:57:30 compute-1 systemd[1]: Starting NTP client/server...
Nov 24 12:57:31 compute-1 chronyd[64869]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 24 12:57:31 compute-1 chronyd[64869]: Frequency -26.621 +/- 0.283 ppm read from /var/lib/chrony/drift
Nov 24 12:57:31 compute-1 chronyd[64869]: Loaded seccomp filter (level 2)
Nov 24 12:57:31 compute-1 systemd[1]: Started NTP client/server.
Nov 24 12:57:31 compute-1 sudo[64859]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:31 compute-1 sshd-session[60010]: Connection closed by 192.168.122.30 port 43702
Nov 24 12:57:31 compute-1 sshd-session[60007]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:57:31 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Nov 24 12:57:31 compute-1 systemd[1]: session-15.scope: Consumed 24.228s CPU time.
Nov 24 12:57:31 compute-1 systemd-logind[815]: Session 15 logged out. Waiting for processes to exit.
Nov 24 12:57:31 compute-1 systemd-logind[815]: Removed session 15.
Nov 24 12:57:37 compute-1 sshd-session[64895]: Accepted publickey for zuul from 192.168.122.30 port 57262 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:57:37 compute-1 systemd-logind[815]: New session 16 of user zuul.
Nov 24 12:57:37 compute-1 systemd[1]: Started Session 16 of User zuul.
Nov 24 12:57:37 compute-1 sshd-session[64895]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:57:38 compute-1 python3.9[65048]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:57:39 compute-1 sudo[65202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wavplhhfvkqzllwqfhgesqtvlgmoppmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989058.9219027-47-94192458179360/AnsiballZ_file.py'
Nov 24 12:57:39 compute-1 sudo[65202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:39 compute-1 python3.9[65204]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:39 compute-1 sudo[65202]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:40 compute-1 sudo[65377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsiojebuvzrdghohuwknqfekzrdkjxnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989059.8791325-63-110644405261234/AnsiballZ_stat.py'
Nov 24 12:57:40 compute-1 sudo[65377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:40 compute-1 python3.9[65379]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:40 compute-1 sudo[65377]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:40 compute-1 sudo[65455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juzfbtswnkcqfsrqhhtywalyjmvflaum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989059.8791325-63-110644405261234/AnsiballZ_file.py'
Nov 24 12:57:40 compute-1 sudo[65455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:41 compute-1 python3.9[65457]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.t_rkf6zn recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:41 compute-1 sudo[65455]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:41 compute-1 sudo[65607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kogllrqqmkjpgbqtldnyouiqshzqkhbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989061.583117-103-162573300599441/AnsiballZ_stat.py'
Nov 24 12:57:41 compute-1 sudo[65607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:42 compute-1 python3.9[65609]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:42 compute-1 sudo[65607]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:42 compute-1 sudo[65730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssaaidgjofliwdaszgwpnmoubsbdcdvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989061.583117-103-162573300599441/AnsiballZ_copy.py'
Nov 24 12:57:42 compute-1 sudo[65730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:42 compute-1 python3.9[65732]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989061.583117-103-162573300599441/.source _original_basename=.esk3umbu follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:42 compute-1 sudo[65730]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:43 compute-1 sudo[65882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etbsrilvsjqgmdlmlctyycskpjwbfzih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989063.079189-135-190734703415014/AnsiballZ_file.py'
Nov 24 12:57:43 compute-1 sudo[65882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:43 compute-1 python3.9[65884]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:43 compute-1 sudo[65882]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:44 compute-1 sudo[66034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcveyakszswrmrxypperlnagdlhzcysp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989063.8031087-151-14749917157709/AnsiballZ_stat.py'
Nov 24 12:57:44 compute-1 sudo[66034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:44 compute-1 python3.9[66036]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:44 compute-1 sudo[66034]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:44 compute-1 sudo[66157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfsdtycsjykcwqdppjpczkdqpsavrggr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989063.8031087-151-14749917157709/AnsiballZ_copy.py'
Nov 24 12:57:44 compute-1 sudo[66157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:44 compute-1 python3.9[66159]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989063.8031087-151-14749917157709/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:44 compute-1 sudo[66157]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:45 compute-1 sudo[66309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bosfregskfsgppbqsytuiocrajjyfqvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989064.9378161-151-238356386618891/AnsiballZ_stat.py'
Nov 24 12:57:45 compute-1 sudo[66309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:45 compute-1 python3.9[66311]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:45 compute-1 sudo[66309]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:45 compute-1 sudo[66432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihiqlsfxfkecgqrcamhockvzsufmnina ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989064.9378161-151-238356386618891/AnsiballZ_copy.py'
Nov 24 12:57:45 compute-1 sudo[66432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:45 compute-1 python3.9[66434]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989064.9378161-151-238356386618891/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:57:46 compute-1 sudo[66432]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:46 compute-1 sudo[66584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlxwlsdkqkvopfcvcaaxpxrygalniswi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989066.4759524-209-57347804574275/AnsiballZ_file.py'
Nov 24 12:57:46 compute-1 sudo[66584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:46 compute-1 python3.9[66586]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:46 compute-1 sudo[66584]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:47 compute-1 sudo[66736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iazgpcuslozwmbrkhzuefiedngjglmpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989067.2220027-225-208339849184385/AnsiballZ_stat.py'
Nov 24 12:57:47 compute-1 sudo[66736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:47 compute-1 python3.9[66738]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:47 compute-1 sudo[66736]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:48 compute-1 sudo[66859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjmwmmfifbmwkwsamkzkovowtmtjeaht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989067.2220027-225-208339849184385/AnsiballZ_copy.py'
Nov 24 12:57:48 compute-1 sudo[66859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:48 compute-1 python3.9[66861]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989067.2220027-225-208339849184385/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:48 compute-1 sudo[66859]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:48 compute-1 sudo[67011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jueblnmxpyqkzpshnihkevxtcwyqxkap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989068.6173813-255-56161229666274/AnsiballZ_stat.py'
Nov 24 12:57:48 compute-1 sudo[67011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:49 compute-1 python3.9[67013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:49 compute-1 sudo[67011]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:49 compute-1 sudo[67134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gepnpyckmrfnumbiiwmxqwjpgeupmzoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989068.6173813-255-56161229666274/AnsiballZ_copy.py'
Nov 24 12:57:49 compute-1 sudo[67134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:49 compute-1 python3.9[67136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989068.6173813-255-56161229666274/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:49 compute-1 sudo[67134]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:50 compute-1 sudo[67286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpgmcjyhgafzomhrbaheynadhqvfyvfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989069.8314803-285-165208050802816/AnsiballZ_systemd.py'
Nov 24 12:57:50 compute-1 sudo[67286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:50 compute-1 python3.9[67288]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:57:50 compute-1 systemd[1]: Reloading.
Nov 24 12:57:50 compute-1 systemd-rc-local-generator[67315]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:57:50 compute-1 systemd-sysv-generator[67319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:57:50 compute-1 systemd[1]: Reloading.
Nov 24 12:57:51 compute-1 systemd-rc-local-generator[67352]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:57:51 compute-1 systemd-sysv-generator[67356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:57:51 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Nov 24 12:57:51 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Nov 24 12:57:51 compute-1 sudo[67286]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:51 compute-1 sudo[67512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rudwzyjrwqhhsmolecmsaufelrwchiqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989071.5873392-301-86863490649017/AnsiballZ_stat.py'
Nov 24 12:57:51 compute-1 sudo[67512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:52 compute-1 python3.9[67514]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:52 compute-1 sudo[67512]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:52 compute-1 sudo[67635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwrmcceublwpyqrdktfvybwihxkrwdvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989071.5873392-301-86863490649017/AnsiballZ_copy.py'
Nov 24 12:57:52 compute-1 sudo[67635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:52 compute-1 python3.9[67637]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989071.5873392-301-86863490649017/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:52 compute-1 sudo[67635]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:53 compute-1 sudo[67787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugnbafweuhlkpepqcbcptjdkkeunfrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989072.9259884-331-81363880227365/AnsiballZ_stat.py'
Nov 24 12:57:53 compute-1 sudo[67787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:53 compute-1 python3.9[67789]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:57:53 compute-1 sudo[67787]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:53 compute-1 sudo[67910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irtysgxenfbvqtlxkyyhcpmaompylald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989072.9259884-331-81363880227365/AnsiballZ_copy.py'
Nov 24 12:57:53 compute-1 sudo[67910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:54 compute-1 python3.9[67912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989072.9259884-331-81363880227365/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:57:54 compute-1 sudo[67910]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:54 compute-1 sudo[68062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtusbecxofbdzlccdlcrsxiptwzaacjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989074.3045182-361-139906700587656/AnsiballZ_systemd.py'
Nov 24 12:57:54 compute-1 sudo[68062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:57:54 compute-1 python3.9[68064]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:57:54 compute-1 systemd[1]: Reloading.
Nov 24 12:57:54 compute-1 systemd-rc-local-generator[68091]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:57:54 compute-1 systemd-sysv-generator[68095]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:57:55 compute-1 systemd[1]: Reloading.
Nov 24 12:57:55 compute-1 systemd-rc-local-generator[68131]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:57:55 compute-1 systemd-sysv-generator[68135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:57:55 compute-1 systemd[1]: Starting Create netns directory...
Nov 24 12:57:55 compute-1 sshd-session[68065]: Invalid user ubuntu from 193.32.162.145 port 46132
Nov 24 12:57:55 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 12:57:55 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 12:57:55 compute-1 systemd[1]: Finished Create netns directory.
Nov 24 12:57:55 compute-1 sudo[68062]: pam_unix(sudo:session): session closed for user root
Nov 24 12:57:55 compute-1 sshd-session[68065]: Connection closed by invalid user ubuntu 193.32.162.145 port 46132 [preauth]
Nov 24 12:57:56 compute-1 python3.9[68293]: ansible-ansible.builtin.service_facts Invoked
Nov 24 12:57:56 compute-1 network[68310]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 12:57:56 compute-1 network[68311]: 'network-scripts' will be removed from distribution in near future.
Nov 24 12:57:56 compute-1 network[68312]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 12:58:02 compute-1 sudo[68573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdsogzrdcvfsgrttsysuhthagkpynuzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989082.2410018-393-245758553777584/AnsiballZ_systemd.py'
Nov 24 12:58:02 compute-1 sudo[68573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:02 compute-1 python3.9[68575]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:58:02 compute-1 systemd[1]: Reloading.
Nov 24 12:58:02 compute-1 systemd-rc-local-generator[68606]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:58:02 compute-1 systemd-sysv-generator[68611]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:58:03 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 24 12:58:03 compute-1 iptables.init[68615]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 24 12:58:03 compute-1 iptables.init[68615]: iptables: Flushing firewall rules: [  OK  ]
Nov 24 12:58:03 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Nov 24 12:58:03 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 24 12:58:03 compute-1 sudo[68573]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:03 compute-1 sudo[68811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiympczlznkhldptlhfmbopomfcolrdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989083.578674-393-86496480762642/AnsiballZ_systemd.py'
Nov 24 12:58:03 compute-1 sudo[68811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:04 compute-1 python3.9[68813]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:58:04 compute-1 sudo[68811]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:04 compute-1 sshd-session[68447]: Invalid user user from 185.156.73.233 port 18820
Nov 24 12:58:04 compute-1 sshd-session[68447]: Connection closed by invalid user user 185.156.73.233 port 18820 [preauth]
Nov 24 12:58:04 compute-1 sudo[68965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptcbpsroifwojfealmswierygzglnnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989084.6737647-425-60129538262058/AnsiballZ_systemd.py'
Nov 24 12:58:04 compute-1 sudo[68965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:05 compute-1 python3.9[68967]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 12:58:05 compute-1 systemd[1]: Reloading.
Nov 24 12:58:05 compute-1 systemd-rc-local-generator[68996]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 12:58:05 compute-1 systemd-sysv-generator[69000]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 12:58:05 compute-1 systemd[1]: Starting Netfilter Tables...
Nov 24 12:58:05 compute-1 systemd[1]: Finished Netfilter Tables.
Nov 24 12:58:05 compute-1 sudo[68965]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:06 compute-1 sudo[69157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhcsblvlgrslvxaogtfqzkllfwhwzwow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989085.9345815-441-104792850293093/AnsiballZ_command.py'
Nov 24 12:58:06 compute-1 sudo[69157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:06 compute-1 python3.9[69159]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:06 compute-1 sudo[69157]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:07 compute-1 sudo[69310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lygvjrhpxhktghmobjszaaebycctxkrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989087.134787-469-83693949302660/AnsiballZ_stat.py'
Nov 24 12:58:07 compute-1 sudo[69310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:07 compute-1 python3.9[69312]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:07 compute-1 sudo[69310]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:08 compute-1 sudo[69435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utmznnjgllkvwmlyxvadqzfxgruqibly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989087.134787-469-83693949302660/AnsiballZ_copy.py'
Nov 24 12:58:08 compute-1 sudo[69435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:08 compute-1 python3.9[69437]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989087.134787-469-83693949302660/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:08 compute-1 sudo[69435]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:08 compute-1 sudo[69588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjmdoogwifdrpvtqtitttnitffwavfyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989088.6287365-499-74167945601524/AnsiballZ_systemd.py'
Nov 24 12:58:08 compute-1 sudo[69588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:09 compute-1 python3.9[69590]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:58:09 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Nov 24 12:58:09 compute-1 sshd[1007]: Received SIGHUP; restarting.
Nov 24 12:58:09 compute-1 sshd[1007]: Server listening on 0.0.0.0 port 22.
Nov 24 12:58:09 compute-1 sshd[1007]: Server listening on :: port 22.
Nov 24 12:58:09 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Nov 24 12:58:09 compute-1 sudo[69588]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:09 compute-1 sudo[69744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-risvaxbzgtcelmumjnukstvxzwpnopab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989089.5292766-515-98693548448058/AnsiballZ_file.py'
Nov 24 12:58:09 compute-1 sudo[69744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:09 compute-1 python3.9[69746]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:10 compute-1 sudo[69744]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:10 compute-1 sudo[69896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjufwvtbdndjwvaczqhghaaqgfrmbnhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989090.1727498-531-210409590311990/AnsiballZ_stat.py'
Nov 24 12:58:10 compute-1 sudo[69896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:10 compute-1 python3.9[69898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:10 compute-1 sudo[69896]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:10 compute-1 sudo[70019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivknfhtauqmetclkctfawiauhcftcidx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989090.1727498-531-210409590311990/AnsiballZ_copy.py'
Nov 24 12:58:10 compute-1 sudo[70019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:11 compute-1 python3.9[70021]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989090.1727498-531-210409590311990/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:11 compute-1 sudo[70019]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:12 compute-1 sudo[70171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fztrzrmadwgujyxiiblodljvbiuotwyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989091.789804-567-12483949343355/AnsiballZ_timezone.py'
Nov 24 12:58:12 compute-1 sudo[70171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:12 compute-1 python3.9[70173]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 24 12:58:12 compute-1 systemd[1]: Starting Time & Date Service...
Nov 24 12:58:12 compute-1 systemd[1]: Started Time & Date Service.
Nov 24 12:58:12 compute-1 sudo[70171]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:13 compute-1 sudo[70327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qijdkfssnxllqupcoimsghozuyuiftbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989092.9990346-585-229867570790092/AnsiballZ_file.py'
Nov 24 12:58:13 compute-1 sudo[70327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:13 compute-1 python3.9[70329]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:13 compute-1 sudo[70327]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:14 compute-1 sudo[70479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbziktbomqodgnkyywalitrjppahujrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989093.734531-601-257698479373549/AnsiballZ_stat.py'
Nov 24 12:58:14 compute-1 sudo[70479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:14 compute-1 python3.9[70481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:14 compute-1 sudo[70479]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:14 compute-1 sudo[70602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmwjfwcpeveytjlkycfpdirkudavyqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989093.734531-601-257698479373549/AnsiballZ_copy.py'
Nov 24 12:58:14 compute-1 sudo[70602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:14 compute-1 python3.9[70604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989093.734531-601-257698479373549/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:14 compute-1 sudo[70602]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:15 compute-1 sudo[70754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwzejtizugnovvmbpzysohkbdpbxnmma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989094.962952-631-271711044286162/AnsiballZ_stat.py'
Nov 24 12:58:15 compute-1 sudo[70754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:15 compute-1 python3.9[70756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:15 compute-1 sudo[70754]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:15 compute-1 sudo[70877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbrkzxjnxeejsepfbtzlkgxckodyukvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989094.962952-631-271711044286162/AnsiballZ_copy.py'
Nov 24 12:58:15 compute-1 sudo[70877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:15 compute-1 python3.9[70879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989094.962952-631-271711044286162/.source.yaml _original_basename=.ipu6faxo follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:15 compute-1 sudo[70877]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:16 compute-1 sudo[71029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikovosgqckwxhahcldpxnzvsklajcmur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989096.252472-661-42495353277153/AnsiballZ_stat.py'
Nov 24 12:58:16 compute-1 sudo[71029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:16 compute-1 python3.9[71031]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:16 compute-1 sudo[71029]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:17 compute-1 sudo[71152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkgmdnhwxzvyhajxzrncajqaugjwqals ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989096.252472-661-42495353277153/AnsiballZ_copy.py'
Nov 24 12:58:17 compute-1 sudo[71152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:17 compute-1 python3.9[71154]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989096.252472-661-42495353277153/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:17 compute-1 sudo[71152]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:17 compute-1 sudo[71304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhlppxjsuguasxhhcnrzrfoeocydefnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989097.6721888-692-160324916624911/AnsiballZ_command.py'
Nov 24 12:58:17 compute-1 sudo[71304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:18 compute-1 python3.9[71306]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:18 compute-1 sudo[71304]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:18 compute-1 sudo[71457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyukjatkyroluybnhiegucwmluxlpggw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989098.3847096-707-22342885662765/AnsiballZ_command.py'
Nov 24 12:58:18 compute-1 sudo[71457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:18 compute-1 python3.9[71459]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:18 compute-1 sudo[71457]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:19 compute-1 sudo[71610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyrvrlixzzibmtwgzgoojxvfqsdnjlk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989099.149272-723-47671405157419/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 12:58:19 compute-1 sudo[71610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:19 compute-1 python3[71612]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 12:58:19 compute-1 sudo[71610]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:20 compute-1 sudo[71764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luywwohkzpumpzltcelmqfvoexkfvnen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989100.111778-740-219150044829347/AnsiballZ_stat.py'
Nov 24 12:58:20 compute-1 sudo[71764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:20 compute-1 python3.9[71766]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:20 compute-1 sudo[71764]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:20 compute-1 sshd-session[71693]: Received disconnect from 80.94.93.119 port 11400:11:  [preauth]
Nov 24 12:58:20 compute-1 sshd-session[71693]: Disconnected from authenticating user root 80.94.93.119 port 11400 [preauth]
Nov 24 12:58:20 compute-1 sudo[71887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsywwdltzgoqvbzzirrwvzotzfbjpjzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989100.111778-740-219150044829347/AnsiballZ_copy.py'
Nov 24 12:58:20 compute-1 sudo[71887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:21 compute-1 python3.9[71889]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989100.111778-740-219150044829347/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:21 compute-1 sudo[71887]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:21 compute-1 sudo[72039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpgzbsrklynkpocnuxfdmgmtkposyiwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989101.3986793-769-126696185828036/AnsiballZ_stat.py'
Nov 24 12:58:21 compute-1 sudo[72039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:21 compute-1 python3.9[72041]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:21 compute-1 sudo[72039]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:22 compute-1 sudo[72162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtoviovgljenqiqugfztjkbqktvbdwri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989101.3986793-769-126696185828036/AnsiballZ_copy.py'
Nov 24 12:58:22 compute-1 sudo[72162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:22 compute-1 python3.9[72164]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989101.3986793-769-126696185828036/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:22 compute-1 sudo[72162]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:23 compute-1 sudo[72314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wueycauvzqefyjofhkswbthrtkulkitb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989102.758899-800-77637311508148/AnsiballZ_stat.py'
Nov 24 12:58:23 compute-1 sudo[72314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:23 compute-1 python3.9[72316]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:23 compute-1 sudo[72314]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:23 compute-1 sudo[72437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqgkxzpmdhcvetqwbqxpizsxlicovwkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989102.758899-800-77637311508148/AnsiballZ_copy.py'
Nov 24 12:58:23 compute-1 sudo[72437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:23 compute-1 python3.9[72439]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989102.758899-800-77637311508148/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:23 compute-1 sudo[72437]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:24 compute-1 sudo[72589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urjypdbhwcansizibghowudjrsxiquim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989104.165056-829-182944605064670/AnsiballZ_stat.py'
Nov 24 12:58:24 compute-1 sudo[72589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:24 compute-1 python3.9[72591]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:24 compute-1 sudo[72589]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:25 compute-1 sudo[72712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwuhkqhlsxojehlnlcwirkyhhvuhfgml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989104.165056-829-182944605064670/AnsiballZ_copy.py'
Nov 24 12:58:25 compute-1 sudo[72712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:25 compute-1 python3.9[72714]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989104.165056-829-182944605064670/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:25 compute-1 sudo[72712]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:25 compute-1 sudo[72864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugjeizfdbvznedhaqnzmkydflarydez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989105.4250617-859-202352664702781/AnsiballZ_stat.py'
Nov 24 12:58:25 compute-1 sudo[72864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:25 compute-1 python3.9[72866]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:58:25 compute-1 sudo[72864]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:26 compute-1 sudo[72987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utjjqkaqomslfspnkubaoxeuzxxksbjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989105.4250617-859-202352664702781/AnsiballZ_copy.py'
Nov 24 12:58:26 compute-1 sudo[72987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:26 compute-1 python3.9[72989]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989105.4250617-859-202352664702781/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:26 compute-1 sudo[72987]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:26 compute-1 sudo[73139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwuysxobcxvrfugslppgaxitylphhtar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989106.701953-889-262702021990940/AnsiballZ_file.py'
Nov 24 12:58:26 compute-1 sudo[73139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:27 compute-1 python3.9[73141]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:27 compute-1 sudo[73139]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:27 compute-1 sudo[73291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edfyjutvowohphkawysrpizpzfpaxnjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989107.4440672-905-218666247191366/AnsiballZ_command.py'
Nov 24 12:58:27 compute-1 sudo[73291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:27 compute-1 python3.9[73293]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:27 compute-1 sudo[73291]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:28 compute-1 sudo[73450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrkwritpwdsfjxwzwhicbmemoxitsyhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989108.1461325-921-263628914254538/AnsiballZ_blockinfile.py'
Nov 24 12:58:28 compute-1 sudo[73450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:28 compute-1 python3.9[73452]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:28 compute-1 sudo[73450]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:29 compute-1 sudo[73603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymwnzxoxwbsqrdeknrsbqmjkmvbygkgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989109.1444213-939-89671338268962/AnsiballZ_file.py'
Nov 24 12:58:29 compute-1 sudo[73603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:29 compute-1 python3.9[73605]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:29 compute-1 sudo[73603]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:29 compute-1 sudo[73755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kveqicqalpwkoxfcnthufbbtuhmamnjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989109.7159753-939-274169652733054/AnsiballZ_file.py'
Nov 24 12:58:29 compute-1 sudo[73755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:30 compute-1 python3.9[73757]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:30 compute-1 sudo[73755]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:30 compute-1 sudo[73907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xonzchogaexqdvvwhrboufeaaxrkmpwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989110.5096374-969-61222850606336/AnsiballZ_mount.py'
Nov 24 12:58:30 compute-1 sudo[73907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:31 compute-1 python3.9[73909]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 24 12:58:31 compute-1 sudo[73907]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:31 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 12:58:31 compute-1 sudo[74061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lotxyfqifxzwyjlxoktbcqzgigpnnegy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989111.350957-969-248711253974761/AnsiballZ_mount.py'
Nov 24 12:58:31 compute-1 sudo[74061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:31 compute-1 python3.9[74063]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 24 12:58:31 compute-1 sudo[74061]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:32 compute-1 sshd-session[64898]: Connection closed by 192.168.122.30 port 57262
Nov 24 12:58:32 compute-1 sshd-session[64895]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:58:32 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Nov 24 12:58:32 compute-1 systemd[1]: session-16.scope: Consumed 34.369s CPU time.
Nov 24 12:58:32 compute-1 systemd-logind[815]: Session 16 logged out. Waiting for processes to exit.
Nov 24 12:58:32 compute-1 systemd-logind[815]: Removed session 16.
Nov 24 12:58:37 compute-1 sshd-session[74089]: Accepted publickey for zuul from 192.168.122.30 port 36834 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:58:37 compute-1 systemd-logind[815]: New session 17 of user zuul.
Nov 24 12:58:37 compute-1 systemd[1]: Started Session 17 of User zuul.
Nov 24 12:58:37 compute-1 sshd-session[74089]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:58:38 compute-1 sudo[74242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnmbpcpbtkygqiqrqecsuudevaqqaubu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989118.0276155-18-72815587787619/AnsiballZ_tempfile.py'
Nov 24 12:58:38 compute-1 sudo[74242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:38 compute-1 python3.9[74244]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 24 12:58:38 compute-1 sudo[74242]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:39 compute-1 sudo[74394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syivphjtbkdkqaifbhkqeybtuccyljtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989118.9169137-42-105045560679341/AnsiballZ_stat.py'
Nov 24 12:58:39 compute-1 sudo[74394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:39 compute-1 python3.9[74396]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:58:39 compute-1 sudo[74394]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:40 compute-1 sudo[74546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvgmqqxggonocawuvtdrxgcsgwbrauwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989119.847897-62-175384968607707/AnsiballZ_setup.py'
Nov 24 12:58:40 compute-1 sudo[74546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:40 compute-1 python3.9[74548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:58:40 compute-1 sudo[74546]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:41 compute-1 sudo[74698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovuwenlcrvbnnzfwphrfxcrpcpnijseo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989121.160505-79-239665179466892/AnsiballZ_blockinfile.py'
Nov 24 12:58:41 compute-1 sudo[74698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:41 compute-1 python3.9[74700]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCj/Yk+1EcGF1m6pab19gzaNkA9lPdjnloOgioy0mYAeATRP7j5ga3kjlWjW19MUwwI4OPMUFVDA3F4H8VqVVTZp4ULgJqcAgxMDSuSdIhakthJhDsEMehoGlHABDLgPlGtZ5WZ/EHbe8eb2s2BmJzhvSOU8QVebMLy5DL+JFDYrsOdFvQeebBUy77z4/7p6E/dQDQsvNE3hUxdzWLGUZQbXxMRMO6uf9j5PNE0HiqlCK5oXfwqIxXlhxrfEGzrzHk/14eTBf+R615lgFNOsgtTYiIIQ3hjpZTL+fH5i3bprVT7Bhj3h8FcYzl86O4hefGMN10ks3kCpA2S+QpoBvNajW5NnLPTC59I1cZNFEMw4EHlhOkmZJkBU/DX3pxTDmjya/at8f728D7kqJN1H5/WnUKsaOleG0hjT1PdlSoRyqRmEk6tfy+8vufxeMerCuTxjClocr4Kttt9ZseN4/io05k4iEo00dLGzzS26bdH5tlNi6SF6dHJ5mt32nxiw5U=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINfyJCtwJfgAVCDAwnsHZ8grLTjfHJu84ysNjSf0WDrs
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJuvAdbbL330IK4vJpPJpyrp02FdoOvUcYgP/rTVj4q1UthqznFc++jFzsb4glLEtEbiWqIR+/hHWBUbdQQsbAk=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC48R6nUwKLxsgAhOmaXu0UKra4RV5I7gYPeQaHF69sS5DuZmggvu8tTAOwonqYWOBGnIx8cpu/38tyW8jTJmr4Y9UYLF+g+j+p5UjNO51X1Z8rNPOk0vwDrZ288Eoyb019scegoRya7CgH+73LktG64/C7KBNa1Rx6gTdk6XSjfjHSIyDhNcRYtlDmFU/eSaSw7SbhNcWN7EraFxr2EdkEzAOTDbwtMAdLIvDPR4qnOnpbaYoa7Zh948X6Z8fdlqSDvOfJeZWv1azhuI9IGDABFcFB458m373ZxXbUfikxc6Ajo/laX65OL1+/OPNJRAspJfofiS2Vjs1W1jr2cqxxapqRn81zCc2whJC6ZqfBZx4WG0sxyQ7HNMTtVCT6iOF9k1FSw2493pXxRl4Z/mQybgKIehCZYiQINgRpTefYD+y4k3Xlr+H9pO3N+jHbxZUe29wNUfEUYPttPvtGnqvdkNQFC8xWMk7T1qPaNAi906nn1OkFohw5mGmbhjKSWE8=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJtGUhwlVQ9OsRFIL3ib/HjbzTAf220QV7qjaecdJHJg
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD7Hni+QLomalr5iPTk3vwv/BtadSAkKDQb1pjKcpcW0In5BHp2c9xDgc2pPe77aLRMY1XhTnzKRHFTp+iyGios=
                                             create=True mode=0644 path=/tmp/ansible.t0hfeaou state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:41 compute-1 sudo[74698]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:42 compute-1 sudo[74850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkumhtxgvlndwsdqhngzodmehbkgtgvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989122.0998662-95-78672323507353/AnsiballZ_command.py'
Nov 24 12:58:42 compute-1 sudo[74850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:42 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 12:58:42 compute-1 python3.9[74852]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.t0hfeaou' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:42 compute-1 sudo[74850]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:43 compute-1 sudo[75007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oabxzhpccmhgxcuvibjndqpamzkjokrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989123.0930383-111-271110458550747/AnsiballZ_file.py'
Nov 24 12:58:43 compute-1 sudo[75007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:43 compute-1 python3.9[75009]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.t0hfeaou state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:43 compute-1 sudo[75007]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:44 compute-1 sshd-session[74092]: Connection closed by 192.168.122.30 port 36834
Nov 24 12:58:44 compute-1 sshd-session[74089]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:58:44 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Nov 24 12:58:44 compute-1 systemd[1]: session-17.scope: Consumed 3.235s CPU time.
Nov 24 12:58:44 compute-1 systemd-logind[815]: Session 17 logged out. Waiting for processes to exit.
Nov 24 12:58:44 compute-1 systemd-logind[815]: Removed session 17.
Nov 24 12:58:50 compute-1 sshd-session[75034]: Accepted publickey for zuul from 192.168.122.30 port 36744 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:58:50 compute-1 systemd-logind[815]: New session 18 of user zuul.
Nov 24 12:58:50 compute-1 systemd[1]: Started Session 18 of User zuul.
Nov 24 12:58:50 compute-1 sshd-session[75034]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:58:51 compute-1 python3.9[75187]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:58:52 compute-1 sudo[75341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbebxifiatgeogpbwdmbfzidmxbrsfhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989132.1290722-45-277410092509079/AnsiballZ_systemd.py'
Nov 24 12:58:52 compute-1 sudo[75341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:53 compute-1 python3.9[75343]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 24 12:58:54 compute-1 sudo[75341]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:54 compute-1 sudo[75495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugdbarcjbniuchueepquwynwyexrshju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989134.3749368-61-90085769811857/AnsiballZ_systemd.py'
Nov 24 12:58:54 compute-1 sudo[75495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:54 compute-1 python3.9[75497]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 12:58:54 compute-1 sudo[75495]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:55 compute-1 sshd-session[75498]: Invalid user sol from 45.148.10.240 port 43976
Nov 24 12:58:55 compute-1 sshd-session[75498]: Connection closed by invalid user sol 45.148.10.240 port 43976 [preauth]
Nov 24 12:58:55 compute-1 sudo[75650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chhkzawhkuoppyongmxdrtpryofjtfws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989135.2408056-79-139103413975319/AnsiballZ_command.py'
Nov 24 12:58:55 compute-1 sudo[75650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:55 compute-1 python3.9[75652]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:55 compute-1 sudo[75650]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:56 compute-1 sudo[75803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpjxgryefitgczirbjscvzznobjdftnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989136.1319635-95-28945016892554/AnsiballZ_stat.py'
Nov 24 12:58:56 compute-1 sudo[75803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:56 compute-1 python3.9[75805]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:58:56 compute-1 sudo[75803]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:57 compute-1 sudo[75957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvxjimgapjwvbkmfqbhjfsphvzmybmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989137.0744932-111-28563960248626/AnsiballZ_command.py'
Nov 24 12:58:57 compute-1 sudo[75957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:57 compute-1 python3.9[75959]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:58:57 compute-1 sudo[75957]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:58 compute-1 sudo[76112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yktyhwkayszcdxxitorphjwxpvicsokp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989137.9069643-127-98911197587502/AnsiballZ_file.py'
Nov 24 12:58:58 compute-1 sudo[76112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:58:58 compute-1 python3.9[76114]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:58:58 compute-1 sudo[76112]: pam_unix(sudo:session): session closed for user root
Nov 24 12:58:59 compute-1 sshd-session[75037]: Connection closed by 192.168.122.30 port 36744
Nov 24 12:58:59 compute-1 sshd-session[75034]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:58:59 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Nov 24 12:58:59 compute-1 systemd[1]: session-18.scope: Consumed 4.200s CPU time.
Nov 24 12:58:59 compute-1 systemd-logind[815]: Session 18 logged out. Waiting for processes to exit.
Nov 24 12:58:59 compute-1 systemd-logind[815]: Removed session 18.
Nov 24 12:59:03 compute-1 sshd-session[76139]: Accepted publickey for zuul from 192.168.122.30 port 57852 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:59:03 compute-1 systemd-logind[815]: New session 19 of user zuul.
Nov 24 12:59:03 compute-1 systemd[1]: Started Session 19 of User zuul.
Nov 24 12:59:03 compute-1 sshd-session[76139]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:59:04 compute-1 python3.9[76292]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:59:05 compute-1 sudo[76446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sschyqtadnflenvaylysqycnjberfuhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989145.3783884-49-104965840958228/AnsiballZ_setup.py'
Nov 24 12:59:05 compute-1 sudo[76446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:05 compute-1 python3.9[76448]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 12:59:06 compute-1 sudo[76446]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:06 compute-1 sudo[76530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkkyrfxurshzzpkiqkvulmdwecrgbphg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989145.3783884-49-104965840958228/AnsiballZ_dnf.py'
Nov 24 12:59:06 compute-1 sudo[76530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:06 compute-1 python3.9[76532]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 12:59:08 compute-1 sudo[76530]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:08 compute-1 python3.9[76683]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 12:59:10 compute-1 python3.9[76834]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 12:59:11 compute-1 python3.9[76984]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:59:11 compute-1 python3.9[77134]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 12:59:12 compute-1 sshd-session[76142]: Connection closed by 192.168.122.30 port 57852
Nov 24 12:59:12 compute-1 sshd-session[76139]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:59:12 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Nov 24 12:59:12 compute-1 systemd[1]: session-19.scope: Consumed 5.639s CPU time.
Nov 24 12:59:12 compute-1 systemd-logind[815]: Session 19 logged out. Waiting for processes to exit.
Nov 24 12:59:12 compute-1 systemd-logind[815]: Removed session 19.
Nov 24 12:59:18 compute-1 sshd-session[77159]: Accepted publickey for zuul from 192.168.122.30 port 54968 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 12:59:18 compute-1 systemd-logind[815]: New session 20 of user zuul.
Nov 24 12:59:18 compute-1 systemd[1]: Started Session 20 of User zuul.
Nov 24 12:59:18 compute-1 sshd-session[77159]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 12:59:19 compute-1 python3.9[77312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 12:59:21 compute-1 sudo[77466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcfbjmljccxatjbamcercsekozskeabv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989160.6462502-80-239468179484538/AnsiballZ_file.py'
Nov 24 12:59:21 compute-1 sudo[77466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:21 compute-1 python3.9[77468]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:21 compute-1 sudo[77466]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:21 compute-1 sudo[77620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkkyevnxdprewktstljvxmqckqkwwoay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989161.4611473-80-179184653386227/AnsiballZ_file.py'
Nov 24 12:59:21 compute-1 sudo[77620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:21 compute-1 python3.9[77622]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:21 compute-1 sudo[77620]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:22 compute-1 sudo[77772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fibhevqvnhenijpkzdcztthugglvnpqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989162.108686-109-259045042495699/AnsiballZ_stat.py'
Nov 24 12:59:22 compute-1 sudo[77772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:22 compute-1 python3.9[77774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:22 compute-1 sudo[77772]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:22 compute-1 sshd-session[77469]: Received disconnect from 61.240.213.113 port 45572:11:  [preauth]
Nov 24 12:59:22 compute-1 sshd-session[77469]: Disconnected from authenticating user root 61.240.213.113 port 45572 [preauth]
Nov 24 12:59:23 compute-1 sudo[77895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pevijbkpsnqukdkjjulbraotzniqznqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989162.108686-109-259045042495699/AnsiballZ_copy.py'
Nov 24 12:59:23 compute-1 sudo[77895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:23 compute-1 python3.9[77897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989162.108686-109-259045042495699/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=9b87a90365314a57bacd795c88dad9688954ab16 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:23 compute-1 sudo[77895]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:23 compute-1 sudo[78047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsnmysvbyfzxdosvrytjxzsbnjwcfzsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989163.6239278-109-103764469380014/AnsiballZ_stat.py'
Nov 24 12:59:23 compute-1 sudo[78047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:24 compute-1 python3.9[78049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:24 compute-1 sudo[78047]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:24 compute-1 sudo[78170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foddukozgsgvpirjqyxgukxoaxufwdsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989163.6239278-109-103764469380014/AnsiballZ_copy.py'
Nov 24 12:59:24 compute-1 sudo[78170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:24 compute-1 python3.9[78172]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989163.6239278-109-103764469380014/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=386be81ecb83f48ca8f13d89cba0edda5d94458f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:24 compute-1 sudo[78170]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:25 compute-1 sudo[78322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jztdkjmmtzecirialxpbffmfufzfyoat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989164.7618215-109-3272476669269/AnsiballZ_stat.py'
Nov 24 12:59:25 compute-1 sudo[78322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:25 compute-1 python3.9[78324]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:25 compute-1 sudo[78322]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:25 compute-1 sudo[78445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwiyyjshzsiknhzydvzxziklyspwsyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989164.7618215-109-3272476669269/AnsiballZ_copy.py'
Nov 24 12:59:25 compute-1 sudo[78445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:25 compute-1 python3.9[78447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989164.7618215-109-3272476669269/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=f96ef2439431c9c17d5f61099a545303236095f5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:25 compute-1 sudo[78445]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:26 compute-1 sudo[78597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbnnpwavqgkqgmiuuprqfarfzlxzmepi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989166.0634573-196-200880031098994/AnsiballZ_file.py'
Nov 24 12:59:26 compute-1 sudo[78597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:26 compute-1 python3.9[78599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:26 compute-1 sudo[78597]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:26 compute-1 sudo[78749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otztfrzfqqnnwhqktkbxkorzmvirjswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989166.657933-196-121852487891405/AnsiballZ_file.py'
Nov 24 12:59:26 compute-1 sudo[78749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:27 compute-1 python3.9[78751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:27 compute-1 sudo[78749]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:27 compute-1 sudo[78901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guuflhvssahirploppejvollfhtbwkiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989167.283266-227-120964269499584/AnsiballZ_stat.py'
Nov 24 12:59:27 compute-1 sudo[78901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:27 compute-1 python3.9[78903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:27 compute-1 sudo[78901]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:28 compute-1 sudo[79024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysdhftgcnvrgozvwiftmveolacyecutf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989167.283266-227-120964269499584/AnsiballZ_copy.py'
Nov 24 12:59:28 compute-1 sudo[79024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:28 compute-1 python3.9[79026]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989167.283266-227-120964269499584/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=daa5a4bf8c72031281e95d3a0560a7f23977cdec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:28 compute-1 sudo[79024]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:28 compute-1 sudo[79176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvdloeyeutwnehbnofmquodalyzepyej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989168.38693-227-112557081543951/AnsiballZ_stat.py'
Nov 24 12:59:28 compute-1 sudo[79176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:28 compute-1 python3.9[79178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:28 compute-1 sudo[79176]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:29 compute-1 sudo[79299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hufmltotinxlaymldrsirpadybnlxlda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989168.38693-227-112557081543951/AnsiballZ_copy.py'
Nov 24 12:59:29 compute-1 sudo[79299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:29 compute-1 python3.9[79301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989168.38693-227-112557081543951/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=f41ebaeccb86de6c2d8e09754514f599dc7195c0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:29 compute-1 sudo[79299]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:29 compute-1 sudo[79451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkxkdghgzninmmmbmvvpcwwjqyjnhqoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989169.4773493-227-118853083730590/AnsiballZ_stat.py'
Nov 24 12:59:29 compute-1 sudo[79451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:29 compute-1 python3.9[79453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:29 compute-1 sudo[79451]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:30 compute-1 sudo[79574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjcflxcgpfqfciqydnxdouwtxaileryy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989169.4773493-227-118853083730590/AnsiballZ_copy.py'
Nov 24 12:59:30 compute-1 sudo[79574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:30 compute-1 python3.9[79576]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989169.4773493-227-118853083730590/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=f7c9b8b1750aab421b4cf85cc15e79fde9c14b41 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:30 compute-1 sudo[79574]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:30 compute-1 sudo[79726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdgfwfudhwfwzlbiaogwwlvvbfeyjrhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989170.6555269-307-36050159538305/AnsiballZ_file.py'
Nov 24 12:59:30 compute-1 sudo[79726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:31 compute-1 python3.9[79728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:31 compute-1 sudo[79726]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:31 compute-1 sudo[79878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjqwkxshqzfknxofthtpojnmjvmxvhem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989171.2762678-307-74838157365889/AnsiballZ_file.py'
Nov 24 12:59:31 compute-1 sudo[79878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:31 compute-1 python3.9[79880]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:31 compute-1 sudo[79878]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:32 compute-1 sudo[80030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yarfhdavokcwijgrcncatnkreqgplbcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989172.0153263-338-79031520518920/AnsiballZ_stat.py'
Nov 24 12:59:32 compute-1 sudo[80030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:32 compute-1 python3.9[80032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:32 compute-1 sudo[80030]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:32 compute-1 sudo[80153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skozevlwmmfnqsxdksvaraugvpvapshs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989172.0153263-338-79031520518920/AnsiballZ_copy.py'
Nov 24 12:59:32 compute-1 sudo[80153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:33 compute-1 python3.9[80155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989172.0153263-338-79031520518920/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=955188595bfe7b6d250ad54e2613cff0f82f8ddb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:33 compute-1 sudo[80153]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:33 compute-1 sudo[80305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crzmjgciawukhrcfmneskvrbfscraoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989173.227468-338-186972080061364/AnsiballZ_stat.py'
Nov 24 12:59:33 compute-1 sudo[80305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:33 compute-1 python3.9[80307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:33 compute-1 sudo[80305]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:33 compute-1 sudo[80428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grjyqbrtgrbllgykaxygglsytlmackjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989173.227468-338-186972080061364/AnsiballZ_copy.py'
Nov 24 12:59:33 compute-1 sudo[80428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:34 compute-1 python3.9[80430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989173.227468-338-186972080061364/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=72e4ec95a1421482c778aa42b1ea137312a19fe7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:34 compute-1 sudo[80428]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:34 compute-1 sudo[80580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njlywhovxhxcoqqinuginipsnqywexon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989174.2752514-338-50396641302520/AnsiballZ_stat.py'
Nov 24 12:59:34 compute-1 sudo[80580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:34 compute-1 python3.9[80582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:34 compute-1 sudo[80580]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:34 compute-1 sudo[80703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eouokglwqccuqizufoxbcwgoynnpbdyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989174.2752514-338-50396641302520/AnsiballZ_copy.py'
Nov 24 12:59:34 compute-1 sudo[80703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:35 compute-1 python3.9[80705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989174.2752514-338-50396641302520/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=d0cd2cfce30fd1826bac94e173a8493048619619 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:35 compute-1 sudo[80703]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:35 compute-1 sudo[80855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iftkpvgfdqcdvggqfimpsttebgacwale ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989175.3892493-419-94010758081021/AnsiballZ_file.py'
Nov 24 12:59:35 compute-1 sudo[80855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:35 compute-1 python3.9[80857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:35 compute-1 sudo[80855]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:36 compute-1 sudo[81007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcklrvpbpvyyzqbqhozasrxgtzbjkypv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989175.9604964-419-247885419441651/AnsiballZ_file.py'
Nov 24 12:59:36 compute-1 sudo[81007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:36 compute-1 python3.9[81009]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:36 compute-1 sudo[81007]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:36 compute-1 sudo[81159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nulfqvzqgjtkttvuhkxedlgonjvoszso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989176.6423485-448-252101786495900/AnsiballZ_stat.py'
Nov 24 12:59:36 compute-1 sudo[81159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:37 compute-1 python3.9[81161]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:37 compute-1 sudo[81159]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:37 compute-1 sudo[81282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvdvublpqzzmzqlqinsijqpvygeywqlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989176.6423485-448-252101786495900/AnsiballZ_copy.py'
Nov 24 12:59:37 compute-1 sudo[81282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:37 compute-1 python3.9[81284]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989176.6423485-448-252101786495900/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=7be90c54fe4d5053614bd3ad5b729b5297a1f0da backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:37 compute-1 sudo[81282]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:38 compute-1 sudo[81434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryqjmqcnosxwrdgvexzejqkgphaqlkev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989177.7369287-448-139483261860495/AnsiballZ_stat.py'
Nov 24 12:59:38 compute-1 sudo[81434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:38 compute-1 python3.9[81436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:38 compute-1 sudo[81434]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:38 compute-1 sudo[81557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeakluzmynaehuflgwethteslydpnstv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989177.7369287-448-139483261860495/AnsiballZ_copy.py'
Nov 24 12:59:38 compute-1 sudo[81557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:38 compute-1 python3.9[81559]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989177.7369287-448-139483261860495/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=72e4ec95a1421482c778aa42b1ea137312a19fe7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:38 compute-1 sudo[81557]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:39 compute-1 sudo[81709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnoralsiahjnhckpwfyqnjhnykvdhjet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989178.9309945-448-246625828288382/AnsiballZ_stat.py'
Nov 24 12:59:39 compute-1 sudo[81709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:39 compute-1 python3.9[81711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:39 compute-1 sudo[81709]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:39 compute-1 sudo[81832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjczgldhhimoiffuxhwgcyxacagdnccm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989178.9309945-448-246625828288382/AnsiballZ_copy.py'
Nov 24 12:59:39 compute-1 sudo[81832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:39 compute-1 python3.9[81834]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989178.9309945-448-246625828288382/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=4d8f1ecfe27708b7849c86a0e5feb75bd7e255c6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:39 compute-1 sudo[81832]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:40 compute-1 sudo[81984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luhgdqjhbwnosqsfqagjldzwtbjgzexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989180.6352084-552-4561518451462/AnsiballZ_file.py'
Nov 24 12:59:40 compute-1 sudo[81984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:40 compute-1 chronyd[64869]: Selected source 54.39.23.64 (pool.ntp.org)
Nov 24 12:59:41 compute-1 python3.9[81986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:41 compute-1 sudo[81984]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:41 compute-1 sudo[82136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luaoyayzyyxhfuljmygkvglpvkbwcdmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989181.2656922-567-260397525441028/AnsiballZ_stat.py'
Nov 24 12:59:41 compute-1 sudo[82136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:41 compute-1 python3.9[82138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:41 compute-1 sudo[82136]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:42 compute-1 sudo[82259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebhifaoafiuqhgjcsakgvknrscxdorra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989181.2656922-567-260397525441028/AnsiballZ_copy.py'
Nov 24 12:59:42 compute-1 sudo[82259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:42 compute-1 python3.9[82261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989181.2656922-567-260397525441028/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:42 compute-1 sudo[82259]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:42 compute-1 sudo[82411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixhsdwuwefcbesqcwwkalzqwrbdzarig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989182.4562647-604-142979584511439/AnsiballZ_file.py'
Nov 24 12:59:42 compute-1 sudo[82411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:43 compute-1 python3.9[82413]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:43 compute-1 sudo[82411]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:43 compute-1 sudo[82563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gydxwniyabgakeejrycpyuhneipophrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989183.2021985-623-149741186913099/AnsiballZ_stat.py'
Nov 24 12:59:43 compute-1 sudo[82563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:43 compute-1 python3.9[82565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:43 compute-1 sudo[82563]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:44 compute-1 sudo[82686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmgkftfpluycjfqhkuqghlpysstgsrcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989183.2021985-623-149741186913099/AnsiballZ_copy.py'
Nov 24 12:59:44 compute-1 sudo[82686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:44 compute-1 python3.9[82688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989183.2021985-623-149741186913099/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:44 compute-1 sudo[82686]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:44 compute-1 sudo[82838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hodujcqsnalfocxpozdoorybimntxapg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989184.4068952-653-17904797091205/AnsiballZ_file.py'
Nov 24 12:59:44 compute-1 sudo[82838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:44 compute-1 python3.9[82840]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:44 compute-1 sudo[82838]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:45 compute-1 sudo[82990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfxgegldlfcvyzfkmnjlihhtjvtcevfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989185.0573075-669-510537420460/AnsiballZ_stat.py'
Nov 24 12:59:45 compute-1 sudo[82990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:45 compute-1 python3.9[82992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:45 compute-1 sudo[82990]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:45 compute-1 sudo[83113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiuawjcaukwoblbcmdvuoxvopfbvnuoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989185.0573075-669-510537420460/AnsiballZ_copy.py'
Nov 24 12:59:45 compute-1 sudo[83113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:45 compute-1 python3.9[83115]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989185.0573075-669-510537420460/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:45 compute-1 sudo[83113]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:46 compute-1 sudo[83265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cefkjegnvdubygdresvctsjvsofpfqii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989186.170715-695-243962545000496/AnsiballZ_file.py'
Nov 24 12:59:46 compute-1 sudo[83265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:46 compute-1 python3.9[83267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:46 compute-1 sudo[83265]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:47 compute-1 sudo[83417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cclbwajnoqxablsxzwafxvpbcatkcwxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989186.7468176-710-59317552093008/AnsiballZ_stat.py'
Nov 24 12:59:47 compute-1 sudo[83417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:47 compute-1 python3.9[83419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:47 compute-1 sudo[83417]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:47 compute-1 sudo[83540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdeapbtkwlnnbdjtstoqtnkoblrpuawc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989186.7468176-710-59317552093008/AnsiballZ_copy.py'
Nov 24 12:59:47 compute-1 sudo[83540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:48 compute-1 python3.9[83542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989186.7468176-710-59317552093008/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:48 compute-1 sudo[83540]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:48 compute-1 sudo[83692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyjrrtxioxtmnvpcksgmfzzsnzbkxkwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989188.3635364-742-257039141704904/AnsiballZ_file.py'
Nov 24 12:59:48 compute-1 sudo[83692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:48 compute-1 python3.9[83694]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:48 compute-1 sudo[83692]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:49 compute-1 sudo[83844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uevajrwwrhikaiatetjuepxmsmaltcaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989189.1026337-758-209283272994071/AnsiballZ_stat.py'
Nov 24 12:59:49 compute-1 sudo[83844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:49 compute-1 python3.9[83846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:49 compute-1 sudo[83844]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:50 compute-1 sudo[83967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdhjqvardkoafhanbkwlbjlkgblveppu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989189.1026337-758-209283272994071/AnsiballZ_copy.py'
Nov 24 12:59:50 compute-1 sudo[83967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:50 compute-1 python3.9[83969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989189.1026337-758-209283272994071/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:50 compute-1 sudo[83967]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:50 compute-1 sudo[84119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnbyllmrxkipnscnubckiclkeaorkjkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989190.418969-789-216706546291547/AnsiballZ_file.py'
Nov 24 12:59:50 compute-1 sudo[84119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:50 compute-1 python3.9[84121]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:50 compute-1 sudo[84119]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:51 compute-1 sudo[84271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lozjbgazojabasxvuwueobkrvfcuggho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989191.000186-804-207695379062403/AnsiballZ_stat.py'
Nov 24 12:59:51 compute-1 sudo[84271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:51 compute-1 python3.9[84273]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:51 compute-1 sudo[84271]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:51 compute-1 sudo[84394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvigclyvvcyqtcocizrmugpmrvevilov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989191.000186-804-207695379062403/AnsiballZ_copy.py'
Nov 24 12:59:51 compute-1 sudo[84394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:52 compute-1 python3.9[84396]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989191.000186-804-207695379062403/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:52 compute-1 sudo[84394]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:52 compute-1 sudo[84546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxbazhpztdmywxdtmkrqdvlgljrslink ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989192.365276-832-186214649447388/AnsiballZ_file.py'
Nov 24 12:59:52 compute-1 sudo[84546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:52 compute-1 python3.9[84548]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 12:59:52 compute-1 sudo[84546]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:53 compute-1 sudo[84698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qshkmrrhymirirvsllsoqlycsndhktaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989193.0335195-847-720123653866/AnsiballZ_stat.py'
Nov 24 12:59:53 compute-1 sudo[84698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:53 compute-1 python3.9[84700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 12:59:53 compute-1 sudo[84698]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:53 compute-1 sudo[84821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofijnvyuitmwnxsrnxbslfhwvjvvwkfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989193.0335195-847-720123653866/AnsiballZ_copy.py'
Nov 24 12:59:53 compute-1 sudo[84821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 12:59:54 compute-1 python3.9[84823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989193.0335195-847-720123653866/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7555e4abd24fd50381399b8a25576eb603fb2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 12:59:54 compute-1 sudo[84821]: pam_unix(sudo:session): session closed for user root
Nov 24 12:59:58 compute-1 sshd-session[77162]: Connection closed by 192.168.122.30 port 54968
Nov 24 12:59:58 compute-1 sshd-session[77159]: pam_unix(sshd:session): session closed for user zuul
Nov 24 12:59:58 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Nov 24 12:59:58 compute-1 systemd[1]: session-20.scope: Consumed 26.574s CPU time.
Nov 24 12:59:58 compute-1 systemd-logind[815]: Session 20 logged out. Waiting for processes to exit.
Nov 24 12:59:58 compute-1 systemd-logind[815]: Removed session 20.
Nov 24 13:00:04 compute-1 sshd-session[84848]: Accepted publickey for zuul from 192.168.122.30 port 39248 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:00:04 compute-1 systemd-logind[815]: New session 21 of user zuul.
Nov 24 13:00:04 compute-1 systemd[1]: Started Session 21 of User zuul.
Nov 24 13:00:04 compute-1 sshd-session[84848]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:00:05 compute-1 python3.9[85001]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:00:06 compute-1 sudo[85155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hapnuuuliwajozyfzgjkellppnfcwwmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989205.789-49-218455055016757/AnsiballZ_file.py'
Nov 24 13:00:06 compute-1 sudo[85155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:06 compute-1 python3.9[85157]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:06 compute-1 sudo[85155]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:06 compute-1 sudo[85307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvxfgpxkjroczyqlrrbijhnhyekdinpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989206.5659711-49-252478323871247/AnsiballZ_file.py'
Nov 24 13:00:06 compute-1 sudo[85307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:06 compute-1 python3.9[85309]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:07 compute-1 sudo[85307]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:07 compute-1 python3.9[85459]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:00:08 compute-1 sudo[85609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iugjgywdvniudbfjrwztdzhemjzmqkgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989208.0726197-95-3752008073129/AnsiballZ_seboolean.py'
Nov 24 13:00:08 compute-1 sudo[85609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:08 compute-1 python3.9[85611]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 24 13:00:09 compute-1 sudo[85609]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:11 compute-1 sudo[85765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrhbbboqavvntbfkgihgjezxucnkhnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989211.035809-115-268804362605896/AnsiballZ_setup.py'
Nov 24 13:00:11 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 24 13:00:11 compute-1 sudo[85765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:11 compute-1 python3.9[85767]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 13:00:11 compute-1 sudo[85765]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:12 compute-1 sudo[85849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wndgshuuvdeipxwcaxvgdvayflgptkuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989211.035809-115-268804362605896/AnsiballZ_dnf.py'
Nov 24 13:00:12 compute-1 sudo[85849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:12 compute-1 python3.9[85851]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 13:00:13 compute-1 sudo[85849]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:14 compute-1 sudo[86002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eerxjdfvzpirwrgolrkbfgatgedoiovm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989214.042441-139-75279963948458/AnsiballZ_systemd.py'
Nov 24 13:00:14 compute-1 sudo[86002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:14 compute-1 python3.9[86004]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:00:14 compute-1 sudo[86002]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:15 compute-1 sudo[86157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbavcrbwkbimahjauvhrtwroowwaimpo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989215.1896715-155-19666542391865/AnsiballZ_edpm_nftables_snippet.py'
Nov 24 13:00:15 compute-1 sudo[86157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:15 compute-1 python3[86159]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 24 13:00:15 compute-1 sudo[86157]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:16 compute-1 sudo[86309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yltjvuwznyzubwhxsgxigyfbtxhhnspq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989216.2034917-173-162539884901524/AnsiballZ_file.py'
Nov 24 13:00:16 compute-1 sudo[86309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:16 compute-1 python3.9[86311]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:16 compute-1 sudo[86309]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:17 compute-1 sudo[86461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghkhxnhzpurfathpctjrlmauazaexktw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989216.98719-189-38207588894163/AnsiballZ_stat.py'
Nov 24 13:00:17 compute-1 sudo[86461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:17 compute-1 python3.9[86463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:17 compute-1 sudo[86461]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:17 compute-1 sudo[86539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enrcmmiggiamkkscngcejouwydneixhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989216.98719-189-38207588894163/AnsiballZ_file.py'
Nov 24 13:00:17 compute-1 sudo[86539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:17 compute-1 python3.9[86541]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:18 compute-1 sudo[86539]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:18 compute-1 sudo[86691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykabbhksarewoupwmxkkpllyhhiciwey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989218.320579-213-119214839505658/AnsiballZ_stat.py'
Nov 24 13:00:18 compute-1 sudo[86691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:18 compute-1 python3.9[86693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:18 compute-1 sudo[86691]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:19 compute-1 sudo[86769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mynszlqrkseychqzfiuuvlvmfmnhzito ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989218.320579-213-119214839505658/AnsiballZ_file.py'
Nov 24 13:00:19 compute-1 sudo[86769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:19 compute-1 python3.9[86771]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hgz8zzl3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:19 compute-1 sudo[86769]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:19 compute-1 sudo[86921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gumgmbvieeudqcnnociefesueqrqpxaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989219.5802622-237-204630441190319/AnsiballZ_stat.py'
Nov 24 13:00:19 compute-1 sudo[86921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:20 compute-1 python3.9[86923]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:20 compute-1 sudo[86921]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:20 compute-1 sudo[86999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wntldpkiyegbtrhzepjvtsplljdzsiwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989219.5802622-237-204630441190319/AnsiballZ_file.py'
Nov 24 13:00:20 compute-1 sudo[86999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:20 compute-1 python3.9[87001]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:20 compute-1 sudo[86999]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:21 compute-1 sudo[87151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blyovqzhsbnkzllizfqnfmyxknypoyws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989220.8238678-263-223727241624130/AnsiballZ_command.py'
Nov 24 13:00:21 compute-1 sudo[87151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:21 compute-1 python3.9[87153]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:21 compute-1 sudo[87151]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:22 compute-1 sudo[87304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwjqdvxycrdvfsvncfhtowecjspsmndv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989221.627479-279-124071388309577/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 13:00:22 compute-1 sudo[87304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:22 compute-1 python3[87306]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 13:00:22 compute-1 sudo[87304]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:22 compute-1 sudo[87456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmnqmxccpltciowtvqvzboegguhpghn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989222.4189045-295-270852236699553/AnsiballZ_stat.py'
Nov 24 13:00:22 compute-1 sudo[87456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:22 compute-1 python3.9[87458]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:22 compute-1 sudo[87456]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:23 compute-1 sudo[87581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrhwwdgbljvvfbtzratbdsofcmswewtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989222.4189045-295-270852236699553/AnsiballZ_copy.py'
Nov 24 13:00:23 compute-1 sudo[87581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:23 compute-1 python3.9[87583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989222.4189045-295-270852236699553/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:23 compute-1 sudo[87581]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:24 compute-1 sudo[87733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlcpczbebjohkaxsmdcnizfnnxyyndtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989223.8848124-325-247812267051704/AnsiballZ_stat.py'
Nov 24 13:00:24 compute-1 sudo[87733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:24 compute-1 python3.9[87735]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:24 compute-1 sudo[87733]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:24 compute-1 sudo[87858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekofrsqynajmiktcxljgxmqxhtkgugmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989223.8848124-325-247812267051704/AnsiballZ_copy.py'
Nov 24 13:00:24 compute-1 sudo[87858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:24 compute-1 python3.9[87860]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989223.8848124-325-247812267051704/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:24 compute-1 sudo[87858]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:25 compute-1 sudo[88010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijvwylnndmzrkdcubfqcnwougbgjkclp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989225.127655-355-166568631280271/AnsiballZ_stat.py'
Nov 24 13:00:25 compute-1 sudo[88010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:25 compute-1 python3.9[88012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:25 compute-1 sudo[88010]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:25 compute-1 sudo[88135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxqvactulavwtnfiymintyckvootqxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989225.127655-355-166568631280271/AnsiballZ_copy.py'
Nov 24 13:00:25 compute-1 sudo[88135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:26 compute-1 python3.9[88137]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989225.127655-355-166568631280271/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:26 compute-1 sudo[88135]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:26 compute-1 sudo[88287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqtfbukvcxrcenadfzlfsrhmaiixxwit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989226.295632-385-185798234606866/AnsiballZ_stat.py'
Nov 24 13:00:26 compute-1 sudo[88287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:26 compute-1 python3.9[88289]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:26 compute-1 sudo[88287]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:27 compute-1 sudo[88412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumirplomjvdicoamzkrugpflwvztaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989226.295632-385-185798234606866/AnsiballZ_copy.py'
Nov 24 13:00:27 compute-1 sudo[88412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:27 compute-1 python3.9[88414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989226.295632-385-185798234606866/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:27 compute-1 sudo[88412]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:27 compute-1 sudo[88564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzsgfdwmimjfmhcruxvgofmlqfbjvjpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989227.4933836-415-255169687760947/AnsiballZ_stat.py'
Nov 24 13:00:27 compute-1 sudo[88564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:27 compute-1 python3.9[88566]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:28 compute-1 sudo[88564]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:28 compute-1 sudo[88689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prtafhsbybjmidclbbgolqmlxdtkysum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989227.4933836-415-255169687760947/AnsiballZ_copy.py'
Nov 24 13:00:28 compute-1 sudo[88689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:28 compute-1 python3.9[88691]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989227.4933836-415-255169687760947/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:28 compute-1 sudo[88689]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:29 compute-1 sudo[88841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqripmquzonkonpvyhdadqundpmzljlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989228.7686267-445-227480552423342/AnsiballZ_file.py'
Nov 24 13:00:29 compute-1 sudo[88841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:29 compute-1 python3.9[88843]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:29 compute-1 sudo[88841]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:29 compute-1 sudo[88993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgocheixyfomosnkcvuuwzhpvzidrfmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989229.4053767-461-198413966773690/AnsiballZ_command.py'
Nov 24 13:00:29 compute-1 sudo[88993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:29 compute-1 python3.9[88995]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:29 compute-1 sudo[88993]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:30 compute-1 sudo[89148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsgrhimaqjgfhmdbzpozwywneudnxgqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989230.0912094-477-12126919272359/AnsiballZ_blockinfile.py'
Nov 24 13:00:30 compute-1 sudo[89148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:30 compute-1 python3.9[89150]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:30 compute-1 sudo[89148]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:31 compute-1 sudo[89300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmxnyaczilzbuobgkkayzgezflacmzbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989231.0399365-495-188520038441868/AnsiballZ_command.py'
Nov 24 13:00:31 compute-1 sudo[89300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:31 compute-1 python3.9[89302]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:31 compute-1 sudo[89300]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:31 compute-1 sudo[89453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bobmxdvfoyceggrnmysvxizxwgshhcos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989231.7123141-511-242648680762912/AnsiballZ_stat.py'
Nov 24 13:00:31 compute-1 sudo[89453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:32 compute-1 python3.9[89455]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:00:32 compute-1 sudo[89453]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:32 compute-1 sudo[89607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elvjmsaujepzkbowoqmvzscnwdgayjna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989232.3732855-527-90868078933516/AnsiballZ_command.py'
Nov 24 13:00:32 compute-1 sudo[89607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:32 compute-1 python3.9[89609]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:32 compute-1 sudo[89607]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:33 compute-1 sudo[89762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rejuggmnfqmzztbbajhcmtzxldxvxvtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989233.1140347-543-149321847497470/AnsiballZ_file.py'
Nov 24 13:00:33 compute-1 sudo[89762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:33 compute-1 python3.9[89764]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:33 compute-1 sudo[89762]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:34 compute-1 python3.9[89914]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:00:35 compute-1 sudo[90065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzcbewlehvsfqllqztqwmqsubdslzjxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989235.2777126-623-193476983686785/AnsiballZ_command.py'
Nov 24 13:00:35 compute-1 sudo[90065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:35 compute-1 python3.9[90067]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:35 compute-1 ovs-vsctl[90068]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 24 13:00:35 compute-1 sudo[90065]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:36 compute-1 sudo[90218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjykjgusxmsscstkdnzrwfuhczxsreim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989236.0138528-641-105992923534466/AnsiballZ_command.py'
Nov 24 13:00:36 compute-1 sudo[90218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:36 compute-1 python3.9[90220]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:36 compute-1 sudo[90218]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:36 compute-1 sudo[90373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgkkoprxgvdpivqlyuoqicqzjqpgxfny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989236.7109342-657-208626267085011/AnsiballZ_command.py'
Nov 24 13:00:36 compute-1 sudo[90373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:37 compute-1 python3.9[90375]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:00:37 compute-1 ovs-vsctl[90376]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 24 13:00:37 compute-1 sudo[90373]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:37 compute-1 python3.9[90526]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:00:38 compute-1 sudo[90678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezracztczdqgmvbdbkinnzgejlohevgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989238.0544825-691-233740456457237/AnsiballZ_file.py'
Nov 24 13:00:38 compute-1 sudo[90678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:38 compute-1 python3.9[90680]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:38 compute-1 sudo[90678]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:39 compute-1 sudo[90830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miohfynakyridiyfbilgawauckbkiusm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989238.8163927-707-117367617841369/AnsiballZ_stat.py'
Nov 24 13:00:39 compute-1 sudo[90830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:39 compute-1 python3.9[90832]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:39 compute-1 sudo[90830]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:39 compute-1 sudo[90908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsztjpxxovuoojtlkmejrvrkbpijddce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989238.8163927-707-117367617841369/AnsiballZ_file.py'
Nov 24 13:00:39 compute-1 sudo[90908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:39 compute-1 python3.9[90910]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:39 compute-1 sudo[90908]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:40 compute-1 sudo[91060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tafnsndmndewnkirxhrunbiaqsmaybly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989239.855407-707-5929901903121/AnsiballZ_stat.py'
Nov 24 13:00:40 compute-1 sudo[91060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:40 compute-1 python3.9[91062]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:40 compute-1 sudo[91060]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:40 compute-1 sudo[91138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsyejsytgyxbuwcrrqdkumppbcywehou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989239.855407-707-5929901903121/AnsiballZ_file.py'
Nov 24 13:00:40 compute-1 sudo[91138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:40 compute-1 python3.9[91140]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:40 compute-1 sudo[91138]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:41 compute-1 sudo[91290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-andxhfmoaoxkrdtrejgvwezlswkdluet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989241.390547-753-187416792406973/AnsiballZ_file.py'
Nov 24 13:00:41 compute-1 sudo[91290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:41 compute-1 python3.9[91292]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:41 compute-1 sudo[91290]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:42 compute-1 sudo[91442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjvyqllhcxsymclvxtxpmmulzhqnyswp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989242.1194952-769-121218938801464/AnsiballZ_stat.py'
Nov 24 13:00:42 compute-1 sudo[91442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:42 compute-1 python3.9[91444]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:42 compute-1 sudo[91442]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:42 compute-1 sudo[91520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtuqrqlohbbinbijirtcoppyzhcizphl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989242.1194952-769-121218938801464/AnsiballZ_file.py'
Nov 24 13:00:42 compute-1 sudo[91520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:42 compute-1 python3.9[91522]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:42 compute-1 sudo[91520]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:43 compute-1 sudo[91672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhsaavibocbfcntpbclxmexoyqesoevz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989243.3900042-793-6149580348404/AnsiballZ_stat.py'
Nov 24 13:00:43 compute-1 sudo[91672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:43 compute-1 python3.9[91674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:43 compute-1 sudo[91672]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:44 compute-1 sudo[91750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nspkfhbidcajwwqpceizdsyluntyegwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989243.3900042-793-6149580348404/AnsiballZ_file.py'
Nov 24 13:00:44 compute-1 sudo[91750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:44 compute-1 python3.9[91752]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:44 compute-1 sudo[91750]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:44 compute-1 sudo[91902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqczifatrlkwkezloltqalnsefybjomh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989244.4985144-817-70660699687535/AnsiballZ_systemd.py'
Nov 24 13:00:44 compute-1 sudo[91902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:45 compute-1 python3.9[91904]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:00:45 compute-1 systemd[1]: Reloading.
Nov 24 13:00:45 compute-1 systemd-sysv-generator[91936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:00:45 compute-1 systemd-rc-local-generator[91933]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:00:45 compute-1 sudo[91902]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:45 compute-1 sudo[92092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hckcyvtfdmchikgrymjlgnyrthhcznum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989245.5356364-833-163559689780141/AnsiballZ_stat.py'
Nov 24 13:00:45 compute-1 sudo[92092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:46 compute-1 python3.9[92094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:46 compute-1 sudo[92092]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:46 compute-1 sudo[92170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzcfbcktcwoemqsmylhccqfvwumlfzvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989245.5356364-833-163559689780141/AnsiballZ_file.py'
Nov 24 13:00:46 compute-1 sudo[92170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:46 compute-1 python3.9[92172]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:46 compute-1 sudo[92170]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:47 compute-1 sudo[92322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripkkbiepgiglxroqaadihfgyrwnijwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989246.79878-857-31015234471156/AnsiballZ_stat.py'
Nov 24 13:00:47 compute-1 sudo[92322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:47 compute-1 python3.9[92324]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:47 compute-1 sudo[92322]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:47 compute-1 sudo[92400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieimlmmxnflilfeqvhilivybjxuerfln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989246.79878-857-31015234471156/AnsiballZ_file.py'
Nov 24 13:00:47 compute-1 sudo[92400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:47 compute-1 python3.9[92402]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:47 compute-1 sudo[92400]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:48 compute-1 sudo[92552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paadiqqpppwbthyrohwfuvfapxwolyql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989248.0603192-881-268184773008817/AnsiballZ_systemd.py'
Nov 24 13:00:48 compute-1 sudo[92552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:48 compute-1 python3.9[92554]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:00:48 compute-1 systemd[1]: Reloading.
Nov 24 13:00:48 compute-1 systemd-rc-local-generator[92582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:00:48 compute-1 systemd-sysv-generator[92586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:00:48 compute-1 systemd[1]: Starting Create netns directory...
Nov 24 13:00:48 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 13:00:48 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 13:00:48 compute-1 systemd[1]: Finished Create netns directory.
Nov 24 13:00:48 compute-1 sudo[92552]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:49 compute-1 sudo[92747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drkhgvflbrntsnvsklkikdanuwusedts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989249.3816333-901-28347804554973/AnsiballZ_file.py'
Nov 24 13:00:49 compute-1 sudo[92747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:49 compute-1 python3.9[92749]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:49 compute-1 sudo[92747]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:50 compute-1 sudo[92899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiaswxntuvpmrgxivfbacztdqcwxkucz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989250.2237124-917-16248490305559/AnsiballZ_stat.py'
Nov 24 13:00:50 compute-1 sudo[92899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:50 compute-1 python3.9[92901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:50 compute-1 sudo[92899]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:51 compute-1 sudo[93022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxrongvbuxqbxtutfqrcekoyffbfjhad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989250.2237124-917-16248490305559/AnsiballZ_copy.py'
Nov 24 13:00:51 compute-1 sudo[93022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:51 compute-1 python3.9[93024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989250.2237124-917-16248490305559/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:51 compute-1 sudo[93022]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:52 compute-1 sudo[93174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbtplofstcfeeeflezvbnsqvsoamqpmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989251.8773322-951-267179525013317/AnsiballZ_file.py'
Nov 24 13:00:52 compute-1 sudo[93174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:52 compute-1 python3.9[93176]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:00:52 compute-1 sudo[93174]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:52 compute-1 sudo[93326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfowqaddlokkyafuzvujxmtcdaftieng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989252.6257846-967-52794589967267/AnsiballZ_stat.py'
Nov 24 13:00:52 compute-1 sudo[93326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:53 compute-1 python3.9[93328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:00:53 compute-1 sudo[93326]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:53 compute-1 sudo[93449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzwcndzvhcdkarepagvvfdfklxnxyrdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989252.6257846-967-52794589967267/AnsiballZ_copy.py'
Nov 24 13:00:53 compute-1 sudo[93449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:53 compute-1 python3.9[93451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989252.6257846-967-52794589967267/.source.json _original_basename=.xhgl1gsh follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:53 compute-1 sudo[93449]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:54 compute-1 sudo[93601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbglresbtcgtsloxxqtuqzvldhfeqaqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989253.8993378-997-37369588246868/AnsiballZ_file.py'
Nov 24 13:00:54 compute-1 sudo[93601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:54 compute-1 python3.9[93603]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:00:54 compute-1 sudo[93601]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:54 compute-1 sudo[93753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjzfzrbwqzuwslptbuooimyrnnwibwds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989254.5549161-1013-151003072435315/AnsiballZ_stat.py'
Nov 24 13:00:54 compute-1 sudo[93753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:54 compute-1 sudo[93753]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:55 compute-1 sudo[93876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdwphsylyqjtibgdguffgwtljwrodwob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989254.5549161-1013-151003072435315/AnsiballZ_copy.py'
Nov 24 13:00:55 compute-1 sudo[93876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:55 compute-1 sudo[93876]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:56 compute-1 sudo[94030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pemaghqlyuwuxyuldguwumrzvabqvhgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989256.117342-1047-245546212264329/AnsiballZ_container_config_data.py'
Nov 24 13:00:56 compute-1 sudo[94030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:56 compute-1 python3.9[94032]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 24 13:00:56 compute-1 sudo[94030]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:57 compute-1 sshd-session[93903]: Invalid user ftpuser from 218.56.160.82 port 10478
Nov 24 13:00:57 compute-1 sudo[94184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suwjxsjuiumjbabqrbzbyusulfmxzpth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989257.116699-1065-216971711503050/AnsiballZ_container_config_hash.py'
Nov 24 13:00:57 compute-1 sudo[94184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:57 compute-1 sshd-session[93903]: Received disconnect from 218.56.160.82 port 10478:11: Bye Bye [preauth]
Nov 24 13:00:57 compute-1 sshd-session[93903]: Disconnected from invalid user ftpuser 218.56.160.82 port 10478 [preauth]
Nov 24 13:00:57 compute-1 python3.9[94186]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:00:57 compute-1 sudo[94184]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:57 compute-1 sshd-session[94162]: Invalid user sol from 45.148.10.240 port 57036
Nov 24 13:00:57 compute-1 sshd-session[94162]: Connection closed by invalid user sol 45.148.10.240 port 57036 [preauth]
Nov 24 13:00:58 compute-1 sudo[94336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coskxdsaazwyrqugomzjwmwwmbufsgqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989258.0287285-1083-45939970444022/AnsiballZ_podman_container_info.py'
Nov 24 13:00:58 compute-1 sudo[94336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:00:58 compute-1 python3.9[94338]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 13:00:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 13:00:58 compute-1 sudo[94336]: pam_unix(sudo:session): session closed for user root
Nov 24 13:00:59 compute-1 sudo[94500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlmftezrrajsuhnwoxmcqarvaqswujyi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989259.3723626-1109-215888764240135/AnsiballZ_edpm_container_manage.py'
Nov 24 13:00:59 compute-1 sudo[94500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:00 compute-1 python3[94502]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:01:00 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 13:01:00 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 13:01:00 compute-1 podman[94537]: 2025-11-24 13:01:00.284544458 +0000 UTC m=+0.042457346 container create e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:01:00 compute-1 podman[94537]: 2025-11-24 13:01:00.262151967 +0000 UTC m=+0.020064865 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 24 13:01:00 compute-1 python3[94502]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume 
/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 24 13:01:00 compute-1 sudo[94500]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:01 compute-1 sudo[94725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsbsffzyvhhljzfwnmkhdsvyghekkhuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989260.7049444-1125-242842191293653/AnsiballZ_stat.py'
Nov 24 13:01:01 compute-1 sudo[94725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:01 compute-1 CROND[94729]: (root) CMD (run-parts /etc/cron.hourly)
Nov 24 13:01:01 compute-1 run-parts[94732]: (/etc/cron.hourly) starting 0anacron
Nov 24 13:01:01 compute-1 anacron[94740]: Anacron started on 2025-11-24
Nov 24 13:01:01 compute-1 anacron[94740]: Will run job `cron.daily' in 29 min.
Nov 24 13:01:01 compute-1 anacron[94740]: Will run job `cron.weekly' in 49 min.
Nov 24 13:01:01 compute-1 anacron[94740]: Will run job `cron.monthly' in 69 min.
Nov 24 13:01:01 compute-1 anacron[94740]: Jobs will be executed sequentially
Nov 24 13:01:01 compute-1 run-parts[94742]: (/etc/cron.hourly) finished 0anacron
Nov 24 13:01:01 compute-1 CROND[94728]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 24 13:01:01 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 13:01:01 compute-1 python3.9[94727]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:01:01 compute-1 sudo[94725]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:01 compute-1 sudo[94894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpdqmfxlolvkzopqpvbzasohmfffwuya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989261.4816186-1143-97657655015831/AnsiballZ_file.py'
Nov 24 13:01:01 compute-1 sudo[94894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:01 compute-1 python3.9[94896]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:01 compute-1 sudo[94894]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:02 compute-1 sudo[94970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zocrejwvjbybbxvgiwmasgswztfqasfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989261.4816186-1143-97657655015831/AnsiballZ_stat.py'
Nov 24 13:01:02 compute-1 sudo[94970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:02 compute-1 python3.9[94972]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:01:02 compute-1 sudo[94970]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:02 compute-1 sudo[95121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-injlvfgqgpxjfwaupzxvtaqtbrocfjpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989262.3652592-1143-258257055805732/AnsiballZ_copy.py'
Nov 24 13:01:02 compute-1 sudo[95121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:02 compute-1 python3.9[95123]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763989262.3652592-1143-258257055805732/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:03 compute-1 sudo[95121]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:03 compute-1 sudo[95197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnmahzmqeusagyqjmpxbwdyivrgftopt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989262.3652592-1143-258257055805732/AnsiballZ_systemd.py'
Nov 24 13:01:03 compute-1 sudo[95197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:03 compute-1 python3.9[95199]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:01:03 compute-1 systemd[1]: Reloading.
Nov 24 13:01:03 compute-1 systemd-rc-local-generator[95228]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:01:03 compute-1 systemd-sysv-generator[95231]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:01:03 compute-1 sudo[95197]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:04 compute-1 sudo[95309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seqfoonlpwbvtisowsilbyuonoflroce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989262.3652592-1143-258257055805732/AnsiballZ_systemd.py'
Nov 24 13:01:04 compute-1 sudo[95309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:04 compute-1 python3.9[95311]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:01:04 compute-1 systemd[1]: Reloading.
Nov 24 13:01:04 compute-1 systemd-rc-local-generator[95343]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:01:04 compute-1 systemd-sysv-generator[95347]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:01:04 compute-1 systemd[1]: Starting ovn_controller container...
Nov 24 13:01:04 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 24 13:01:04 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:01:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6624f76b880caeacc646b6ae8abbc0d5fc6e499153648caee7485b2c39259d18/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 13:01:04 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321.
Nov 24 13:01:04 compute-1 podman[95353]: 2025-11-24 13:01:04.717783857 +0000 UTC m=+0.113976237 container init e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 13:01:04 compute-1 ovn_controller[95368]: + sudo -E kolla_set_configs
Nov 24 13:01:04 compute-1 podman[95353]: 2025-11-24 13:01:04.741776217 +0000 UTC m=+0.137968567 container start e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:01:04 compute-1 edpm-start-podman-container[95353]: ovn_controller
Nov 24 13:01:04 compute-1 systemd[1]: Created slice User Slice of UID 0.
Nov 24 13:01:04 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 24 13:01:04 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 24 13:01:04 compute-1 systemd[1]: Starting User Manager for UID 0...
Nov 24 13:01:04 compute-1 edpm-start-podman-container[95352]: Creating additional drop-in dependency for "ovn_controller" (e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321)
Nov 24 13:01:04 compute-1 systemd[95406]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 24 13:01:04 compute-1 systemd[1]: Reloading.
Nov 24 13:01:04 compute-1 podman[95375]: 2025-11-24 13:01:04.838756614 +0000 UTC m=+0.085798123 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 24 13:01:04 compute-1 systemd-rc-local-generator[95448]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:01:04 compute-1 systemd-sysv-generator[95455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:01:04 compute-1 systemd[95406]: Queued start job for default target Main User Target.
Nov 24 13:01:04 compute-1 systemd[95406]: Created slice User Application Slice.
Nov 24 13:01:04 compute-1 systemd[95406]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 24 13:01:04 compute-1 systemd[95406]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:01:04 compute-1 systemd[95406]: Reached target Paths.
Nov 24 13:01:04 compute-1 systemd[95406]: Reached target Timers.
Nov 24 13:01:04 compute-1 systemd[95406]: Starting D-Bus User Message Bus Socket...
Nov 24 13:01:04 compute-1 systemd[95406]: Starting Create User's Volatile Files and Directories...
Nov 24 13:01:04 compute-1 systemd[95406]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:01:04 compute-1 systemd[95406]: Finished Create User's Volatile Files and Directories.
Nov 24 13:01:04 compute-1 systemd[95406]: Reached target Sockets.
Nov 24 13:01:04 compute-1 systemd[95406]: Reached target Basic System.
Nov 24 13:01:04 compute-1 systemd[95406]: Reached target Main User Target.
Nov 24 13:01:04 compute-1 systemd[95406]: Startup finished in 118ms.
Nov 24 13:01:05 compute-1 systemd[1]: Started User Manager for UID 0.
Nov 24 13:01:05 compute-1 systemd[1]: Started ovn_controller container.
Nov 24 13:01:05 compute-1 systemd[1]: e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321-5d4cd773b54b6e35.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 13:01:05 compute-1 systemd[1]: e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321-5d4cd773b54b6e35.service: Failed with result 'exit-code'.
Nov 24 13:01:05 compute-1 systemd[1]: Started Session c1 of User root.
Nov 24 13:01:05 compute-1 sudo[95309]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:05 compute-1 ovn_controller[95368]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 13:01:05 compute-1 ovn_controller[95368]: INFO:__main__:Validating config file
Nov 24 13:01:05 compute-1 ovn_controller[95368]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 13:01:05 compute-1 ovn_controller[95368]: INFO:__main__:Writing out command to execute
Nov 24 13:01:05 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: ++ cat /run_command
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + ARGS=
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + sudo kolla_copy_cacerts
Nov 24 13:01:05 compute-1 systemd[1]: Started Session c2 of User root.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + [[ ! -n '' ]]
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + . kolla_extend_start
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 24 13:01:05 compute-1 ovn_controller[95368]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + umask 0022
Nov 24 13:01:05 compute-1 ovn_controller[95368]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 24 13:01:05 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.1721] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.1731] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.1741] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.1746] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.1749] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 13:01:05 compute-1 kernel: br-int: entered promiscuous mode
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 13:01:05 compute-1 systemd-udevd[95498]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 13:01:05 compute-1 ovn_controller[95368]: 2025-11-24T13:01:05Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.3894] manager: (ovn-475838-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 24 13:01:05 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Nov 24 13:01:05 compute-1 systemd-udevd[95500]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.4089] device (genev_sys_6081): carrier: link connected
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.4093] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 24 13:01:05 compute-1 NetworkManager[55527]: <info>  [1763989265.6056] manager: (ovn-9f02b0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 24 13:01:05 compute-1 sudo[95628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqqbvlkzoeydjtklguhspuqwupvtkkjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989265.6114643-1199-264380797823633/AnsiballZ_command.py'
Nov 24 13:01:05 compute-1 sudo[95628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:06 compute-1 python3.9[95630]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:01:06 compute-1 ovs-vsctl[95631]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 24 13:01:06 compute-1 sudo[95628]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:06 compute-1 sudo[95781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiqhorvxxcwiiazsskvxaqmjklitvlfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989266.3988261-1215-3488047583803/AnsiballZ_command.py'
Nov 24 13:01:06 compute-1 sudo[95781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:06 compute-1 python3.9[95783]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:01:06 compute-1 ovs-vsctl[95785]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 24 13:01:06 compute-1 sudo[95781]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:07 compute-1 sudo[95936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnixycylkfnlqrsxicxgeoimescqrjtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989267.3201027-1243-12566855738070/AnsiballZ_command.py'
Nov 24 13:01:07 compute-1 sudo[95936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:07 compute-1 python3.9[95938]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:01:07 compute-1 ovs-vsctl[95939]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 24 13:01:07 compute-1 sudo[95936]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:08 compute-1 sshd-session[84851]: Connection closed by 192.168.122.30 port 39248
Nov 24 13:01:08 compute-1 sshd-session[84848]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:01:08 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Nov 24 13:01:08 compute-1 systemd[1]: session-21.scope: Consumed 42.071s CPU time.
Nov 24 13:01:08 compute-1 systemd-logind[815]: Session 21 logged out. Waiting for processes to exit.
Nov 24 13:01:08 compute-1 systemd-logind[815]: Removed session 21.
Nov 24 13:01:13 compute-1 sshd-session[95964]: Accepted publickey for zuul from 192.168.122.30 port 54878 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:01:13 compute-1 systemd-logind[815]: New session 23 of user zuul.
Nov 24 13:01:13 compute-1 systemd[1]: Started Session 23 of User zuul.
Nov 24 13:01:13 compute-1 sshd-session[95964]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:01:14 compute-1 python3.9[96117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:01:15 compute-1 systemd[1]: Stopping User Manager for UID 0...
Nov 24 13:01:15 compute-1 systemd[95406]: Activating special unit Exit the Session...
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped target Main User Target.
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped target Basic System.
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped target Paths.
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped target Sockets.
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped target Timers.
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:01:15 compute-1 systemd[95406]: Closed D-Bus User Message Bus Socket.
Nov 24 13:01:15 compute-1 systemd[95406]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:01:15 compute-1 systemd[95406]: Removed slice User Application Slice.
Nov 24 13:01:15 compute-1 systemd[95406]: Reached target Shutdown.
Nov 24 13:01:15 compute-1 systemd[95406]: Finished Exit the Session.
Nov 24 13:01:15 compute-1 systemd[95406]: Reached target Exit the Session.
Nov 24 13:01:15 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Nov 24 13:01:15 compute-1 systemd[1]: Stopped User Manager for UID 0.
Nov 24 13:01:15 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 24 13:01:15 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 24 13:01:15 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 24 13:01:15 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 24 13:01:15 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Nov 24 13:01:15 compute-1 sudo[96276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaxdicblanhpdmtccqqrsxtayibdfacn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989275.584661-49-232102278470738/AnsiballZ_file.py'
Nov 24 13:01:15 compute-1 sudo[96276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:16 compute-1 python3.9[96278]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:16 compute-1 sudo[96276]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:16 compute-1 sudo[96428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vciwudpbmlremeimttudiyzolijqsfnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989276.3354049-49-85500903120950/AnsiballZ_file.py'
Nov 24 13:01:16 compute-1 sudo[96428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:16 compute-1 python3.9[96430]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:16 compute-1 sudo[96428]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:17 compute-1 sudo[96580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsuxksjsotyzqcwgqrldfspvggqnnksv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989276.9615052-49-49478287539641/AnsiballZ_file.py'
Nov 24 13:01:17 compute-1 sudo[96580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:17 compute-1 python3.9[96582]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:17 compute-1 sudo[96580]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:17 compute-1 sudo[96733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vevssicrjwjyoyjevrnvpfxntzpbgomt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989277.5587869-49-16104219851693/AnsiballZ_file.py'
Nov 24 13:01:17 compute-1 sudo[96733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:17 compute-1 python3.9[96736]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:18 compute-1 sudo[96733]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:18 compute-1 sshd-session[96732]: Invalid user solana from 193.32.162.145 port 59634
Nov 24 13:01:18 compute-1 sshd-session[96732]: Connection closed by invalid user solana 193.32.162.145 port 59634 [preauth]
Nov 24 13:01:18 compute-1 sudo[96886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vantcvlzgnuudnxbtxovduncllxbirkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989278.1328716-49-240642169673472/AnsiballZ_file.py'
Nov 24 13:01:18 compute-1 sudo[96886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:18 compute-1 python3.9[96888]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:18 compute-1 sudo[96886]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:20 compute-1 python3.9[97038]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:01:20 compute-1 sudo[97188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmhqioypeklzsoionpxxbxouugalrdab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989280.2886727-137-248139843693715/AnsiballZ_seboolean.py'
Nov 24 13:01:20 compute-1 sudo[97188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:21 compute-1 python3.9[97190]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 24 13:01:21 compute-1 sudo[97188]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:22 compute-1 python3.9[97341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:23 compute-1 python3.9[97462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989281.9255857-153-159490785762666/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:23 compute-1 python3.9[97613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:24 compute-1 python3.9[97734]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989283.5067313-183-71233119150285/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:25 compute-1 sudo[97884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqmgginazvmdzgcayjxcbjyqwllgocdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989285.00805-217-62174135139119/AnsiballZ_setup.py'
Nov 24 13:01:25 compute-1 sudo[97884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:25 compute-1 python3.9[97886]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 13:01:25 compute-1 sudo[97884]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:26 compute-1 sudo[97968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrxsqpdsrzkcmtujtmqibyhjehzersoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989285.00805-217-62174135139119/AnsiballZ_dnf.py'
Nov 24 13:01:26 compute-1 sudo[97968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:26 compute-1 python3.9[97970]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 13:01:27 compute-1 sudo[97968]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:28 compute-1 sudo[98121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppomaawaobqumncjaedqvypedetjiagw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989287.9247565-241-113468509213944/AnsiballZ_systemd.py'
Nov 24 13:01:28 compute-1 sudo[98121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:28 compute-1 python3.9[98123]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:01:28 compute-1 sudo[98121]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:29 compute-1 python3.9[98276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:30 compute-1 python3.9[98397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989289.150795-257-136645592647968/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:30 compute-1 python3.9[98548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:31 compute-1 python3.9[98669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989290.3112826-257-263538456854011/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:32 compute-1 sshd-session[97267]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:01:32 compute-1 sshd-session[97267]: banner exchange: Connection from 180.184.134.158 port 55576: Connection timed out
Nov 24 13:01:32 compute-1 python3.9[98819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:33 compute-1 python3.9[98942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989292.3168929-345-265990654788759/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:33 compute-1 python3.9[99092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:34 compute-1 python3.9[99213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989293.4016247-345-106033001664833/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:34 compute-1 sshd-session[98914]: Invalid user stperez from 175.100.24.139 port 53232
Nov 24 13:01:34 compute-1 sshd-session[98914]: Received disconnect from 175.100.24.139 port 53232:11: Bye Bye [preauth]
Nov 24 13:01:34 compute-1 sshd-session[98914]: Disconnected from invalid user stperez 175.100.24.139 port 53232 [preauth]
Nov 24 13:01:35 compute-1 ovn_controller[95368]: 2025-11-24T13:01:35Z|00025|memory|INFO|16384 kB peak resident set size after 30.0 seconds
Nov 24 13:01:35 compute-1 ovn_controller[95368]: 2025-11-24T13:01:35Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Nov 24 13:01:35 compute-1 podman[99337]: 2025-11-24 13:01:35.233387979 +0000 UTC m=+0.115674789 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:01:35 compute-1 python3.9[99375]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:01:36 compute-1 sudo[99541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpvuvztytmhwkbkrteccnaynfrfaooqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989295.7639334-421-135560312273420/AnsiballZ_file.py'
Nov 24 13:01:36 compute-1 sudo[99541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:36 compute-1 python3.9[99543]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:36 compute-1 sudo[99541]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:36 compute-1 sudo[99693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptitczshnmhxalbulnhmqitswatbjxin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989296.563808-437-73015830849177/AnsiballZ_stat.py'
Nov 24 13:01:36 compute-1 sudo[99693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:37 compute-1 python3.9[99695]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:37 compute-1 sudo[99693]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:37 compute-1 sudo[99771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqkfzhthhwmuaxgzyjphhpnfppupefdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989296.563808-437-73015830849177/AnsiballZ_file.py'
Nov 24 13:01:37 compute-1 sudo[99771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:37 compute-1 python3.9[99773]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:37 compute-1 sudo[99771]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:37 compute-1 sudo[99923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgplgrhdfvnufcqtkibzrufdbhwhjeot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989297.6126547-437-135601859289744/AnsiballZ_stat.py'
Nov 24 13:01:37 compute-1 sudo[99923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:38 compute-1 python3.9[99925]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:38 compute-1 sudo[99923]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:38 compute-1 sudo[100001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmrawzkbwefsgswjyxxziohrcowixslt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989297.6126547-437-135601859289744/AnsiballZ_file.py'
Nov 24 13:01:38 compute-1 sudo[100001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:38 compute-1 python3.9[100003]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:38 compute-1 sudo[100001]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:39 compute-1 sudo[100153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwcslemflmjxrctgmbbxorlmjwypftzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989298.9701245-483-245667641315894/AnsiballZ_file.py'
Nov 24 13:01:39 compute-1 sudo[100153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:39 compute-1 python3.9[100155]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:39 compute-1 sudo[100153]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:40 compute-1 sudo[100305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqazmvbdlsxtzibwyrjmmohdpzsocppv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989299.683594-499-106746936022719/AnsiballZ_stat.py'
Nov 24 13:01:40 compute-1 sudo[100305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:40 compute-1 python3.9[100307]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:40 compute-1 sudo[100305]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:40 compute-1 sudo[100383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlbmachyluoftcjrmrpkizeilarjfbnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989299.683594-499-106746936022719/AnsiballZ_file.py'
Nov 24 13:01:40 compute-1 sudo[100383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:40 compute-1 python3.9[100385]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:40 compute-1 sudo[100383]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:41 compute-1 sudo[100535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhmrsxelwzqykjgexdhhdbbtjcjwgra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989300.9989247-523-152027054630149/AnsiballZ_stat.py'
Nov 24 13:01:41 compute-1 sudo[100535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:41 compute-1 python3.9[100537]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:41 compute-1 sudo[100535]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:41 compute-1 sudo[100613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bechapyzezylijasrwedftqdgffjzwrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989300.9989247-523-152027054630149/AnsiballZ_file.py'
Nov 24 13:01:41 compute-1 sudo[100613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:41 compute-1 python3.9[100615]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:41 compute-1 sudo[100613]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:42 compute-1 sudo[100765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raqbjnyxenprehapjryshxvheycecmbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989302.2836313-547-76328720001337/AnsiballZ_systemd.py'
Nov 24 13:01:42 compute-1 sudo[100765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:42 compute-1 python3.9[100767]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:01:42 compute-1 systemd[1]: Reloading.
Nov 24 13:01:42 compute-1 systemd-rc-local-generator[100791]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:01:42 compute-1 systemd-sysv-generator[100795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:01:43 compute-1 sudo[100765]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:43 compute-1 sudo[100954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agrkjscssybtcdoptlsdioklyacvorbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989303.5465767-563-56427924501988/AnsiballZ_stat.py'
Nov 24 13:01:43 compute-1 sudo[100954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:43 compute-1 python3.9[100956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:44 compute-1 sudo[100954]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:44 compute-1 sudo[101032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xejniydxoyzgonobnjndioyfyyfkyhvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989303.5465767-563-56427924501988/AnsiballZ_file.py'
Nov 24 13:01:44 compute-1 sudo[101032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:44 compute-1 python3.9[101034]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:44 compute-1 sudo[101032]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:45 compute-1 sudo[101184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdhzixqqfeoikkcffvplfwrqevltbrpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989304.8469315-587-68042740207492/AnsiballZ_stat.py'
Nov 24 13:01:45 compute-1 sudo[101184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:45 compute-1 python3.9[101186]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:45 compute-1 sudo[101184]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:45 compute-1 sudo[101262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neqjyjctxmrntnktbzieawqfgpxzeetw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989304.8469315-587-68042740207492/AnsiballZ_file.py'
Nov 24 13:01:45 compute-1 sudo[101262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:45 compute-1 python3.9[101264]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:45 compute-1 sudo[101262]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:46 compute-1 sudo[101414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vydhbufzansafkxubtnzkpoxuopyfgxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989306.0765002-611-209525767141393/AnsiballZ_systemd.py'
Nov 24 13:01:46 compute-1 sudo[101414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:46 compute-1 python3.9[101416]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:01:46 compute-1 systemd[1]: Reloading.
Nov 24 13:01:46 compute-1 systemd-sysv-generator[101445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:01:46 compute-1 systemd-rc-local-generator[101440]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:01:46 compute-1 systemd[1]: Starting Create netns directory...
Nov 24 13:01:46 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 13:01:46 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 13:01:46 compute-1 systemd[1]: Finished Create netns directory.
Nov 24 13:01:46 compute-1 sudo[101414]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:47 compute-1 sudo[101607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwhjjfjfdyctsfqlagwpvyjvurcpwmix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989307.28054-631-151173624939461/AnsiballZ_file.py'
Nov 24 13:01:47 compute-1 sudo[101607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:47 compute-1 python3.9[101609]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:47 compute-1 sudo[101607]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:48 compute-1 sudo[101759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cukpgvqtjzzthxyipzmhusgplzwmxqko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989308.1345396-647-196445555393318/AnsiballZ_stat.py'
Nov 24 13:01:48 compute-1 sudo[101759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:48 compute-1 python3.9[101761]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:48 compute-1 sudo[101759]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:48 compute-1 sudo[101882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumgkcnnmffafcwklvdxxtpyldojjjdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989308.1345396-647-196445555393318/AnsiballZ_copy.py'
Nov 24 13:01:48 compute-1 sudo[101882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:49 compute-1 python3.9[101884]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989308.1345396-647-196445555393318/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:49 compute-1 sudo[101882]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:49 compute-1 sudo[102034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovdequfcsonckuqpbrwnujrpdcdvtmcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989309.6950061-681-195631776256928/AnsiballZ_file.py'
Nov 24 13:01:49 compute-1 sudo[102034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:50 compute-1 python3.9[102036]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:01:50 compute-1 sudo[102034]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:50 compute-1 sudo[102186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pucolidxwyvvgfhlbmtvtqgfedckhdzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989310.4364183-697-194971988962206/AnsiballZ_stat.py'
Nov 24 13:01:50 compute-1 sudo[102186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:50 compute-1 python3.9[102188]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:01:50 compute-1 sudo[102186]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:51 compute-1 sudo[102309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfngzcnyugvcujsuxdvzhcmscdtkxnca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989310.4364183-697-194971988962206/AnsiballZ_copy.py'
Nov 24 13:01:51 compute-1 sudo[102309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:51 compute-1 python3.9[102311]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989310.4364183-697-194971988962206/.source.json _original_basename=.meq43plj follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:51 compute-1 sudo[102309]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:52 compute-1 sudo[102461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-relzlnksbziyelxvylbxywdtgdzbvihj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989311.819384-727-32659349360304/AnsiballZ_file.py'
Nov 24 13:01:52 compute-1 sudo[102461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:52 compute-1 python3.9[102463]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:52 compute-1 sudo[102461]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:52 compute-1 sudo[102613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzllkcwkvzsqghqquzzinjmidattkjty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989312.5279534-743-228416822528622/AnsiballZ_stat.py'
Nov 24 13:01:52 compute-1 sudo[102613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:53 compute-1 sudo[102613]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:53 compute-1 sudo[102736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nghawszuceredrhrxitfakqgbqexrwiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989312.5279534-743-228416822528622/AnsiballZ_copy.py'
Nov 24 13:01:53 compute-1 sudo[102736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:53 compute-1 sudo[102736]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:54 compute-1 sudo[102888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngkydycmooiumuojykxdelnwgppmyhjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989313.9340727-777-280139349155433/AnsiballZ_container_config_data.py'
Nov 24 13:01:54 compute-1 sudo[102888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:54 compute-1 python3.9[102890]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 24 13:01:54 compute-1 sudo[102888]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:55 compute-1 sudo[103040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geoizxizwospctyizxjbjswcridqkmce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989314.8107119-795-240421847794879/AnsiballZ_container_config_hash.py'
Nov 24 13:01:55 compute-1 sudo[103040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:55 compute-1 python3.9[103042]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:01:55 compute-1 sudo[103040]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:56 compute-1 sudo[103192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yogkmrozsdpwcpulzfavvfctkkutwyhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989315.6995862-813-261282821782630/AnsiballZ_podman_container_info.py'
Nov 24 13:01:56 compute-1 sudo[103192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:56 compute-1 python3.9[103194]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 13:01:56 compute-1 sudo[103192]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:57 compute-1 sudo[103371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foknogdfshptoglhblhokywadhpcbcek ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989317.0512686-839-207058981151104/AnsiballZ_edpm_container_manage.py'
Nov 24 13:01:57 compute-1 sudo[103371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:57 compute-1 python3[103373]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:01:57 compute-1 podman[103407]: 2025-11-24 13:01:57.975298317 +0000 UTC m=+0.050421929 container create a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 13:01:57 compute-1 podman[103407]: 2025-11-24 13:01:57.948569337 +0000 UTC m=+0.023692939 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:01:57 compute-1 python3[103373]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:01:58 compute-1 sudo[103371]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:58 compute-1 sudo[103595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edoxaaloaozlttfsdhsjjrqobisiwrox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989318.2871401-855-173389828730966/AnsiballZ_stat.py'
Nov 24 13:01:58 compute-1 sudo[103595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:58 compute-1 python3.9[103597]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:01:58 compute-1 sudo[103595]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:59 compute-1 sudo[103749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkcmtytpeamihcjwfqafjgwulzclpneu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989319.087136-873-98522183969932/AnsiballZ_file.py'
Nov 24 13:01:59 compute-1 sudo[103749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:59 compute-1 python3.9[103751]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:01:59 compute-1 sudo[103749]: pam_unix(sudo:session): session closed for user root
Nov 24 13:01:59 compute-1 sudo[103825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjdgknyxsjlutlorviiqeckhhcmgrxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989319.087136-873-98522183969932/AnsiballZ_stat.py'
Nov 24 13:01:59 compute-1 sudo[103825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:01:59 compute-1 python3.9[103827]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:01:59 compute-1 sudo[103825]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:00 compute-1 sudo[103976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbzjbbhuvogfebymxwqggfvxiglbzsmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989319.9340842-873-207887874726440/AnsiballZ_copy.py'
Nov 24 13:02:00 compute-1 sudo[103976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:00 compute-1 python3.9[103978]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763989319.9340842-873-207887874726440/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:00 compute-1 sudo[103976]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:00 compute-1 sudo[104052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npsntdifvfltngimhtrcqnjjuyvxiyoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989319.9340842-873-207887874726440/AnsiballZ_systemd.py'
Nov 24 13:02:00 compute-1 sudo[104052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:01 compute-1 python3.9[104054]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:02:01 compute-1 systemd[1]: Reloading.
Nov 24 13:02:01 compute-1 systemd-rc-local-generator[104079]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:02:01 compute-1 systemd-sysv-generator[104082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:02:01 compute-1 sudo[104052]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:01 compute-1 sudo[104162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spvaqvizpgokrceqzlqogpupprwhgwic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989319.9340842-873-207887874726440/AnsiballZ_systemd.py'
Nov 24 13:02:01 compute-1 sudo[104162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:01 compute-1 python3.9[104164]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:01 compute-1 systemd[1]: Reloading.
Nov 24 13:02:01 compute-1 systemd-sysv-generator[104195]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:02:01 compute-1 systemd-rc-local-generator[104190]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:02:02 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Nov 24 13:02:02 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:02:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4e9afaa03c7e458fcd6d8fb1457468b37d3ac36fd2a1ff853ab92c6fbd6a890/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 24 13:02:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4e9afaa03c7e458fcd6d8fb1457468b37d3ac36fd2a1ff853ab92c6fbd6a890/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:02:02 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4.
Nov 24 13:02:02 compute-1 podman[104205]: 2025-11-24 13:02:02.275077337 +0000 UTC m=+0.106556158 container init a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + sudo -E kolla_set_configs
Nov 24 13:02:02 compute-1 podman[104205]: 2025-11-24 13:02:02.30308382 +0000 UTC m=+0.134562581 container start a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:02:02 compute-1 edpm-start-podman-container[104205]: ovn_metadata_agent
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Validating config file
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Copying service configuration files
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Writing out command to execute
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: ++ cat /run_command
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + CMD=neutron-ovn-metadata-agent
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + ARGS=
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + sudo kolla_copy_cacerts
Nov 24 13:02:02 compute-1 edpm-start-podman-container[104204]: Creating additional drop-in dependency for "ovn_metadata_agent" (a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4)
Nov 24 13:02:02 compute-1 podman[104227]: 2025-11-24 13:02:02.374601686 +0000 UTC m=+0.058644797 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + [[ ! -n '' ]]
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + . kolla_extend_start
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: Running command: 'neutron-ovn-metadata-agent'
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + umask 0022
Nov 24 13:02:02 compute-1 ovn_metadata_agent[104220]: + exec neutron-ovn-metadata-agent
Nov 24 13:02:02 compute-1 systemd[1]: Reloading.
Nov 24 13:02:02 compute-1 systemd-sysv-generator[104299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:02:02 compute-1 systemd-rc-local-generator[104291]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:02:02 compute-1 systemd[1]: Started ovn_metadata_agent container.
Nov 24 13:02:02 compute-1 sudo[104162]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:03 compute-1 sshd-session[95967]: Connection closed by 192.168.122.30 port 54878
Nov 24 13:02:03 compute-1 sshd-session[95964]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:02:03 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Nov 24 13:02:03 compute-1 systemd[1]: session-23.scope: Consumed 31.941s CPU time.
Nov 24 13:02:03 compute-1 systemd-logind[815]: Session 23 logged out. Waiting for processes to exit.
Nov 24 13:02:03 compute-1 systemd-logind[815]: Removed session 23.
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.084 104225 INFO neutron.common.config [-] Logging enabled!
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.084 104225 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.084 104225 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.085 104225 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.086 104225 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.087 104225 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.088 104225 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.088 104225 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.088 104225 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.088 104225 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.088 104225 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.089 104225 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.090 104225 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.090 104225 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.090 104225 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.090 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.090 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.091 104225 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.092 104225 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.092 104225 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.092 104225 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.092 104225 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.092 104225 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.092 104225 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.093 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.094 104225 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.094 104225 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.094 104225 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.094 104225 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.094 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.094 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.095 104225 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.096 104225 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.096 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.096 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.096 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.096 104225 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.096 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.097 104225 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.098 104225 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.098 104225 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.098 104225 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.098 104225 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.098 104225 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.098 104225 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.099 104225 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.100 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.101 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.102 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.103 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.104 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.105 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.106 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.107 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.108 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.109 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.110 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.111 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.112 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.113 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.114 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.114 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.114 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.114 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.114 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.114 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.115 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.116 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.117 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.118 104225 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.119 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.120 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.120 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.120 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.120 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.120 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.120 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.121 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.122 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.123 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.124 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.124 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.124 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.124 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.124 104225 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.124 104225 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.136 104225 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.136 104225 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.136 104225 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.137 104225 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.137 104225 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.153 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 971456df-f9ba-4c8a-bc15-c9feb573d541 (UUID: 971456df-f9ba-4c8a-bc15-c9feb573d541) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.179 104225 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.179 104225 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.179 104225 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.179 104225 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.183 104225 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.188 104225 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.194 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '971456df-f9ba-4c8a-bc15-c9feb573d541'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], external_ids={}, name=971456df-f9ba-4c8a-bc15-c9feb573d541, nb_cfg_timestamp=1763989273266, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.195 104225 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f588fc21b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.195 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.196 104225 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.196 104225 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.196 104225 INFO oslo_service.service [-] Starting 1 workers
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.200 104225 DEBUG oslo_service.service [-] Started child 104331 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.204 104331 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-177552'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.204 104225 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpv_d1bsyp/privsep.sock']
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.227 104331 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.227 104331 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.227 104331 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.232 104331 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.237 104331 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.244 104331 INFO eventlet.wsgi.server [-] (104331) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 24 13:02:04 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.834 104225 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.834 104225 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpv_d1bsyp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.730 104336 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.737 104336 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.742 104336 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.742 104336 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104336
Nov 24 13:02:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:04.836 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[2886b284-b3c1-4def-acf4-7b35ce1a62c6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.315 104336 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.316 104336 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.316 104336 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:02:05 compute-1 podman[104341]: 2025-11-24 13:02:05.586665542 +0000 UTC m=+0.126947988 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.844 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[d7920ab2-ec88-4f5c-9381-20d739e29b9b]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.846 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, column=external_ids, values=({'neutron:ovn-metadata-id': 'f8661130-458f-5c51-adaf-e242d3dd6a5f'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.855 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.862 104225 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.863 104225 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.864 104225 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.865 104225 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.866 104225 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.867 104225 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.868 104225 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.869 104225 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.870 104225 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.871 104225 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.872 104225 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.872 104225 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.872 104225 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.872 104225 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.873 104225 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.874 104225 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.875 104225 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.876 104225 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.876 104225 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.876 104225 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.876 104225 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.876 104225 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.876 104225 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.877 104225 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.878 104225 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.879 104225 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.880 104225 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.881 104225 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.882 104225 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.883 104225 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.884 104225 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.885 104225 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.886 104225 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.887 104225 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.888 104225 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.889 104225 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.890 104225 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.891 104225 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.892 104225 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.893 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.894 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.895 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.896 104225 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.897 104225 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.897 104225 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.897 104225 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.897 104225 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:02:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:02:05.897 104225 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 13:02:09 compute-1 sshd-session[104368]: Accepted publickey for zuul from 192.168.122.30 port 40590 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:02:09 compute-1 systemd-logind[815]: New session 24 of user zuul.
Nov 24 13:02:09 compute-1 systemd[1]: Started Session 24 of User zuul.
Nov 24 13:02:09 compute-1 sshd-session[104368]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:02:10 compute-1 python3.9[104521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:02:10 compute-1 sshd-session[104522]: Received disconnect from 85.209.134.43 port 46220:11: Bye Bye [preauth]
Nov 24 13:02:10 compute-1 sshd-session[104522]: Disconnected from authenticating user root 85.209.134.43 port 46220 [preauth]
Nov 24 13:02:11 compute-1 sudo[104677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqeblkjybbkfgtnqijjmyizkxwfpadxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989331.167188-49-163310976816266/AnsiballZ_command.py'
Nov 24 13:02:11 compute-1 sudo[104677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:11 compute-1 python3.9[104679]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:11 compute-1 sudo[104677]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:12 compute-1 sudo[104842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjwvnqdmpvohyrmchzbdvkwdqlrlvvsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989332.3598423-71-191688552418038/AnsiballZ_systemd_service.py'
Nov 24 13:02:12 compute-1 sudo[104842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:13 compute-1 python3.9[104844]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:02:13 compute-1 systemd[1]: Reloading.
Nov 24 13:02:13 compute-1 systemd-rc-local-generator[104870]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:02:13 compute-1 systemd-sysv-generator[104874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:02:13 compute-1 sudo[104842]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:14 compute-1 python3.9[105028]: ansible-ansible.builtin.service_facts Invoked
Nov 24 13:02:14 compute-1 network[105045]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 13:02:14 compute-1 network[105046]: 'network-scripts' will be removed from distribution in near future.
Nov 24 13:02:14 compute-1 network[105047]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 13:02:20 compute-1 sudo[105306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egiphgkklintcwjbrzjfqqbhghremgmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989340.3615215-109-233821826283771/AnsiballZ_systemd_service.py'
Nov 24 13:02:20 compute-1 sudo[105306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:21 compute-1 python3.9[105308]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:21 compute-1 sudo[105306]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:21 compute-1 sudo[105459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzcuzmekqjfszyfswixjtaxwimkzvqmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989341.2620616-109-195898664501152/AnsiballZ_systemd_service.py'
Nov 24 13:02:21 compute-1 sudo[105459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:21 compute-1 python3.9[105461]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:21 compute-1 sudo[105459]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:22 compute-1 sudo[105612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-furhyizaoopzquierrtxtmjxzckfoiih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989342.1375473-109-94763915407104/AnsiballZ_systemd_service.py'
Nov 24 13:02:22 compute-1 sudo[105612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:22 compute-1 python3.9[105614]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:22 compute-1 sudo[105612]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:23 compute-1 sudo[105765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbhnoxczxgectqywmawpidhxxlawnlog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989342.8754656-109-96661624092103/AnsiballZ_systemd_service.py'
Nov 24 13:02:23 compute-1 sudo[105765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:23 compute-1 python3.9[105767]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:23 compute-1 sudo[105765]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:24 compute-1 sudo[105918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fukqnssexjvmoodprxpadqlohzrqmqus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989343.8044221-109-34265016034575/AnsiballZ_systemd_service.py'
Nov 24 13:02:24 compute-1 sudo[105918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:24 compute-1 python3.9[105920]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:24 compute-1 sudo[105918]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:24 compute-1 sudo[106071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpmnzikapczwjpnzrgykabrpqesbldei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989344.5495422-109-42416404692943/AnsiballZ_systemd_service.py'
Nov 24 13:02:24 compute-1 sudo[106071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:25 compute-1 python3.9[106073]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:25 compute-1 sudo[106071]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:25 compute-1 sudo[106224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prudhwtvdzdizqlhkrfcdtuasrgcrtgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989345.322341-109-186313120369854/AnsiballZ_systemd_service.py'
Nov 24 13:02:25 compute-1 sudo[106224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:25 compute-1 python3.9[106226]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:02:25 compute-1 sudo[106224]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:27 compute-1 sudo[106377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btcqrtdwywplsyxecwircwcgjlnxrmgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989346.7196052-213-19071775639746/AnsiballZ_file.py'
Nov 24 13:02:27 compute-1 sudo[106377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:27 compute-1 python3.9[106379]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:27 compute-1 sudo[106377]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:27 compute-1 sudo[106529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozwphdwalfjqdgkegvpiusleyycnkasw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989347.460946-213-74344444582280/AnsiballZ_file.py'
Nov 24 13:02:27 compute-1 sudo[106529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:27 compute-1 python3.9[106531]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:27 compute-1 sudo[106529]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:28 compute-1 sudo[106681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqzqzkxogzducmtirjojcfgftmpsgllx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989348.0349655-213-231839529108118/AnsiballZ_file.py'
Nov 24 13:02:28 compute-1 sudo[106681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:28 compute-1 python3.9[106683]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:28 compute-1 sudo[106681]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:28 compute-1 sudo[106833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnrsrwyyfggpvifsibiodtsjjguqpyfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989348.600812-213-51294109429416/AnsiballZ_file.py'
Nov 24 13:02:28 compute-1 sudo[106833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:29 compute-1 python3.9[106835]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:29 compute-1 sudo[106833]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:29 compute-1 sudo[106985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgnkmqiifrryymiqnspsbnnyxkgptomq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989349.1509056-213-31017705707815/AnsiballZ_file.py'
Nov 24 13:02:29 compute-1 sudo[106985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:29 compute-1 python3.9[106987]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:29 compute-1 sudo[106985]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:29 compute-1 sudo[107137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmeysqaxjisnflypqashrcifvvrlwwgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989349.7096457-213-10242010665138/AnsiballZ_file.py'
Nov 24 13:02:29 compute-1 sudo[107137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:30 compute-1 python3.9[107139]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:30 compute-1 sudo[107137]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:30 compute-1 sudo[107289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqhsqippnxhbxzxeutjxtuxikvlkpicm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989350.279716-213-205692188423304/AnsiballZ_file.py'
Nov 24 13:02:30 compute-1 sudo[107289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:30 compute-1 python3.9[107291]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:30 compute-1 sudo[107289]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:32 compute-1 sudo[107441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmscjvwgiayoxioodfeockvlymcqwhtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989351.81126-313-222781210090753/AnsiballZ_file.py'
Nov 24 13:02:32 compute-1 sudo[107441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:32 compute-1 python3.9[107443]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:32 compute-1 sudo[107441]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:32 compute-1 podman[107468]: 2025-11-24 13:02:32.509556131 +0000 UTC m=+0.055347928 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 13:02:32 compute-1 sudo[107613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrvgvddlkrwpwnwyrvpprzstcldeurjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989352.469335-313-276648642223490/AnsiballZ_file.py'
Nov 24 13:02:32 compute-1 sudo[107613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:32 compute-1 python3.9[107615]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:32 compute-1 sudo[107613]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:33 compute-1 sudo[107765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmshwxtbpbhpeakmfgjcpsvybqrzqobp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989353.0943716-313-110250447620871/AnsiballZ_file.py'
Nov 24 13:02:33 compute-1 sudo[107765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:33 compute-1 python3.9[107767]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:33 compute-1 sudo[107765]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:33 compute-1 sudo[107917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnoitfovozdfydsqihaclsabshtusttw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989353.6572688-313-58146603888476/AnsiballZ_file.py'
Nov 24 13:02:33 compute-1 sudo[107917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:34 compute-1 python3.9[107919]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:34 compute-1 sudo[107917]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:34 compute-1 sudo[108069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntaxyvnametkakarosvixparoftzdcbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989354.2255466-313-10979349170535/AnsiballZ_file.py'
Nov 24 13:02:34 compute-1 sudo[108069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:34 compute-1 python3.9[108071]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:34 compute-1 sudo[108069]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:35 compute-1 sudo[108221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygplfzorlaaxzbrljmayzzxvkidozqln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989354.792689-313-61276523839538/AnsiballZ_file.py'
Nov 24 13:02:35 compute-1 sudo[108221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:35 compute-1 python3.9[108223]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:35 compute-1 sudo[108221]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:35 compute-1 sudo[108390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krqqglpfvrltgspumrpzsrhrswancsli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989355.5074313-313-114876678855506/AnsiballZ_file.py'
Nov 24 13:02:35 compute-1 sudo[108390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:35 compute-1 podman[108347]: 2025-11-24 13:02:35.803192392 +0000 UTC m=+0.091810167 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 13:02:36 compute-1 python3.9[108399]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:02:36 compute-1 sudo[108390]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:37 compute-1 sudo[108553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seynwtkkqikavvpngbddvxdgkxchnrxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989356.9810529-415-236184157395448/AnsiballZ_command.py'
Nov 24 13:02:37 compute-1 sudo[108553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:37 compute-1 python3.9[108555]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:37 compute-1 sudo[108553]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:38 compute-1 python3.9[108707]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 13:02:38 compute-1 sudo[108857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxbohkoetohclcpmcgcvsoqmrzrxcnia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989358.68448-451-7946136192157/AnsiballZ_systemd_service.py'
Nov 24 13:02:38 compute-1 sudo[108857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:39 compute-1 python3.9[108859]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:02:39 compute-1 systemd[1]: Reloading.
Nov 24 13:02:39 compute-1 systemd-rc-local-generator[108888]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:02:39 compute-1 systemd-sysv-generator[108891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:02:39 compute-1 sudo[108857]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:40 compute-1 sudo[109045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaxmuuaodfcrmhiqwzszngfsocwnjpsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989359.7399757-467-68508356680929/AnsiballZ_command.py'
Nov 24 13:02:40 compute-1 sudo[109045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:40 compute-1 python3.9[109047]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:40 compute-1 sudo[109045]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:40 compute-1 sudo[109198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjxluwpectgeesekemkobypaitwxkwxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989360.3422923-467-182868860028208/AnsiballZ_command.py'
Nov 24 13:02:40 compute-1 sudo[109198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:40 compute-1 python3.9[109200]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:40 compute-1 sudo[109198]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:41 compute-1 sudo[109351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcpykmofqryiwugkwfwyrugptcklzxcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989361.0246058-467-129702820733298/AnsiballZ_command.py'
Nov 24 13:02:41 compute-1 sudo[109351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:41 compute-1 python3.9[109353]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:41 compute-1 sudo[109351]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:41 compute-1 sudo[109504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qedueduysveysazitknjwqijiyckygaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989361.6701167-467-39821758343320/AnsiballZ_command.py'
Nov 24 13:02:41 compute-1 sudo[109504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:42 compute-1 python3.9[109506]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:42 compute-1 sudo[109504]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:42 compute-1 sudo[109657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahyygbsxuuoaqqtbcumfanqbyrocupoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989362.348656-467-219929353461140/AnsiballZ_command.py'
Nov 24 13:02:42 compute-1 sudo[109657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:42 compute-1 python3.9[109659]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:42 compute-1 sudo[109657]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:43 compute-1 sudo[109810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeberpqnjtopgzcqkqrgeekdmlljtpfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989363.0321054-467-44153088473736/AnsiballZ_command.py'
Nov 24 13:02:43 compute-1 sudo[109810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:43 compute-1 python3.9[109812]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:43 compute-1 sudo[109810]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:44 compute-1 sudo[109963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdxlyypdexhzwovplohkxwzbuiqcgwnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989363.6522481-467-47517473439611/AnsiballZ_command.py'
Nov 24 13:02:44 compute-1 sudo[109963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:44 compute-1 python3.9[109965]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:02:44 compute-1 sudo[109963]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:45 compute-1 sudo[110116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urzmvxcbptzdjxwntiecdvxqbokmsoet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989365.5417047-575-92159477565721/AnsiballZ_getent.py'
Nov 24 13:02:45 compute-1 sudo[110116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:46 compute-1 python3.9[110118]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 24 13:02:46 compute-1 sudo[110116]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:47 compute-1 sudo[110269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noqflqpccapmadtbrucczzjxpmbhusrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989366.5191245-591-122255991988616/AnsiballZ_group.py'
Nov 24 13:02:47 compute-1 sudo[110269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:47 compute-1 python3.9[110271]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 13:02:47 compute-1 groupadd[110272]: group added to /etc/group: name=libvirt, GID=42473
Nov 24 13:02:47 compute-1 groupadd[110272]: group added to /etc/gshadow: name=libvirt
Nov 24 13:02:47 compute-1 groupadd[110272]: new group: name=libvirt, GID=42473
Nov 24 13:02:47 compute-1 sudo[110269]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:49 compute-1 sudo[110427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpwdptanroffpkfjlzpuotylqdyvmpui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989368.5876558-607-28118188862763/AnsiballZ_user.py'
Nov 24 13:02:49 compute-1 sudo[110427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:49 compute-1 python3.9[110429]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 13:02:49 compute-1 useradd[110431]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 13:02:49 compute-1 sudo[110427]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:50 compute-1 sudo[110588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gypjyqdkvsiwlyhuvcmvksxjlvhbcfof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989370.0047145-629-195848720160363/AnsiballZ_setup.py'
Nov 24 13:02:50 compute-1 sudo[110588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:50 compute-1 python3.9[110590]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 13:02:50 compute-1 sudo[110588]: pam_unix(sudo:session): session closed for user root
Nov 24 13:02:51 compute-1 sudo[110673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grcqyuifrichsgyggrnxoipyztppieqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989370.0047145-629-195848720160363/AnsiballZ_dnf.py'
Nov 24 13:02:51 compute-1 sudo[110673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:02:51 compute-1 python3.9[110675]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 13:02:54 compute-1 sshd-session[110686]: Invalid user sol from 45.148.10.240 port 44900
Nov 24 13:02:55 compute-1 sshd-session[110686]: Connection closed by invalid user sol 45.148.10.240 port 44900 [preauth]
Nov 24 13:03:01 compute-1 sshd-session[110699]: Invalid user sam from 45.78.217.131 port 44410
Nov 24 13:03:01 compute-1 sshd-session[110699]: Received disconnect from 45.78.217.131 port 44410:11: Bye Bye [preauth]
Nov 24 13:03:01 compute-1 sshd-session[110699]: Disconnected from invalid user sam 45.78.217.131 port 44410 [preauth]
Nov 24 13:03:03 compute-1 podman[110861]: 2025-11-24 13:03:03.516724356 +0000 UTC m=+0.061355298 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 13:03:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:03:04.127 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:03:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:03:04.128 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:03:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:03:04.128 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:03:06 compute-1 podman[110881]: 2025-11-24 13:03:06.565651199 +0000 UTC m=+0.106321269 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 13:03:16 compute-1 kernel: SELinux:  Converting 2757 SID table entries...
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:03:16 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:03:26 compute-1 kernel: SELinux:  Converting 2757 SID table entries...
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:03:26 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:03:34 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 24 13:03:34 compute-1 podman[110928]: 2025-11-24 13:03:34.511441335 +0000 UTC m=+0.052204695 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:03:37 compute-1 podman[110948]: 2025-11-24 13:03:37.528688639 +0000 UTC m=+0.076708314 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 24 13:04:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:04:04.128 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:04:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:04:04.128 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:04:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:04:04.129 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:04:05 compute-1 podman[127511]: 2025-11-24 13:04:05.521340914 +0000 UTC m=+0.062166889 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Nov 24 13:04:07 compute-1 podman[127778]: 2025-11-24 13:04:07.885461292 +0000 UTC m=+0.104373492 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 13:04:16 compute-1 sshd-session[127813]: Invalid user administrator from 68.183.82.237 port 55816
Nov 24 13:04:17 compute-1 sshd-session[127813]: Received disconnect from 68.183.82.237 port 55816:11: Bye Bye [preauth]
Nov 24 13:04:17 compute-1 sshd-session[127813]: Disconnected from invalid user administrator 68.183.82.237 port 55816 [preauth]
Nov 24 13:04:18 compute-1 kernel: SELinux:  Converting 2758 SID table entries...
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:04:18 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:04:19 compute-1 groupadd[127827]: group added to /etc/group: name=dnsmasq, GID=992
Nov 24 13:04:19 compute-1 groupadd[127827]: group added to /etc/gshadow: name=dnsmasq
Nov 24 13:04:19 compute-1 groupadd[127827]: new group: name=dnsmasq, GID=992
Nov 24 13:04:19 compute-1 useradd[127834]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 24 13:04:19 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 13:04:19 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 24 13:04:19 compute-1 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Nov 24 13:04:20 compute-1 groupadd[127847]: group added to /etc/group: name=clevis, GID=991
Nov 24 13:04:20 compute-1 groupadd[127847]: group added to /etc/gshadow: name=clevis
Nov 24 13:04:20 compute-1 groupadd[127847]: new group: name=clevis, GID=991
Nov 24 13:04:20 compute-1 useradd[127854]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 24 13:04:20 compute-1 usermod[127864]: add 'clevis' to group 'tss'
Nov 24 13:04:20 compute-1 usermod[127864]: add 'clevis' to shadow group 'tss'
Nov 24 13:04:23 compute-1 polkitd[43597]: Reloading rules
Nov 24 13:04:23 compute-1 polkitd[43597]: Collecting garbage unconditionally...
Nov 24 13:04:23 compute-1 polkitd[43597]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 13:04:23 compute-1 polkitd[43597]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 13:04:23 compute-1 polkitd[43597]: Finished loading, compiling and executing 3 rules
Nov 24 13:04:23 compute-1 polkitd[43597]: Reloading rules
Nov 24 13:04:23 compute-1 polkitd[43597]: Collecting garbage unconditionally...
Nov 24 13:04:23 compute-1 polkitd[43597]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 13:04:23 compute-1 polkitd[43597]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 13:04:23 compute-1 polkitd[43597]: Finished loading, compiling and executing 3 rules
Nov 24 13:04:24 compute-1 groupadd[128051]: group added to /etc/group: name=ceph, GID=167
Nov 24 13:04:24 compute-1 groupadd[128051]: group added to /etc/gshadow: name=ceph
Nov 24 13:04:24 compute-1 groupadd[128051]: new group: name=ceph, GID=167
Nov 24 13:04:24 compute-1 useradd[128057]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 24 13:04:27 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Nov 24 13:04:27 compute-1 sshd[1007]: Received signal 15; terminating.
Nov 24 13:04:27 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Nov 24 13:04:27 compute-1 systemd[1]: sshd.service: Unit process 110514 (sshd-session) remains running after unit stopped.
Nov 24 13:04:27 compute-1 systemd[1]: sshd.service: Unit process 110595 (sshd-session) remains running after unit stopped.
Nov 24 13:04:27 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Nov 24 13:04:27 compute-1 systemd[1]: sshd.service: Consumed 2.691s CPU time, 12.6M memory peak, read 564.0K from disk, written 116.0K to disk.
Nov 24 13:04:27 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Nov 24 13:04:27 compute-1 systemd[1]: Stopping sshd-keygen.target...
Nov 24 13:04:27 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 13:04:27 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 13:04:27 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 13:04:27 compute-1 systemd[1]: Reached target sshd-keygen.target.
Nov 24 13:04:27 compute-1 systemd[1]: Starting OpenSSH server daemon...
Nov 24 13:04:27 compute-1 sshd[128576]: Server listening on 0.0.0.0 port 22.
Nov 24 13:04:27 compute-1 sshd[128576]: Server listening on :: port 22.
Nov 24 13:04:27 compute-1 systemd[1]: Started OpenSSH server daemon.
Nov 24 13:04:29 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 13:04:29 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 13:04:29 compute-1 systemd[1]: Reloading.
Nov 24 13:04:29 compute-1 systemd-sysv-generator[128838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:29 compute-1 systemd-rc-local-generator[128834]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:29 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 13:04:31 compute-1 sshd-session[128769]: Connection closed by authenticating user root 80.94.95.115 port 25664 [preauth]
Nov 24 13:04:32 compute-1 sshd-session[130883]: Invalid user jito from 193.32.162.145 port 44916
Nov 24 13:04:32 compute-1 sudo[110673]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:32 compute-1 sshd-session[130883]: Connection closed by invalid user jito 193.32.162.145 port 44916 [preauth]
Nov 24 13:04:36 compute-1 podman[136562]: 2025-11-24 13:04:36.521589031 +0000 UTC m=+0.063868781 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 24 13:04:37 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 13:04:37 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 13:04:37 compute-1 systemd[1]: man-db-cache-update.service: Consumed 9.934s CPU time.
Nov 24 13:04:37 compute-1 systemd[1]: run-r565be5a7528641cab3f7721ad6629ab0.service: Deactivated successfully.
Nov 24 13:04:38 compute-1 podman[137263]: 2025-11-24 13:04:38.554134679 +0000 UTC m=+0.106526467 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 13:04:40 compute-1 sshd-session[137287]: Invalid user ftpuser from 176.114.89.34 port 60852
Nov 24 13:04:40 compute-1 sshd-session[137287]: Received disconnect from 176.114.89.34 port 60852:11: Bye Bye [preauth]
Nov 24 13:04:40 compute-1 sshd-session[137287]: Disconnected from invalid user ftpuser 176.114.89.34 port 60852 [preauth]
Nov 24 13:04:42 compute-1 sudo[137414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sapwvwqhzzhnnbpujhpzggpotyedkmwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989482.0009797-653-189017399894321/AnsiballZ_systemd.py'
Nov 24 13:04:42 compute-1 sudo[137414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:42 compute-1 python3.9[137416]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:04:43 compute-1 systemd[1]: Reloading.
Nov 24 13:04:43 compute-1 systemd-rc-local-generator[137445]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:43 compute-1 systemd-sysv-generator[137448]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:43 compute-1 sudo[137414]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:43 compute-1 sudo[137603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyqdpnlwdoiwckswnyowlfzojshrjuhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989483.4583824-653-204496148107559/AnsiballZ_systemd.py'
Nov 24 13:04:43 compute-1 sudo[137603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:44 compute-1 python3.9[137605]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:04:44 compute-1 systemd[1]: Reloading.
Nov 24 13:04:44 compute-1 systemd-sysv-generator[137638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:44 compute-1 systemd-rc-local-generator[137632]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:44 compute-1 sudo[137603]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:44 compute-1 sudo[137792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpnqftvqbjzehpliwwxqonwxxggglmvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989484.57249-653-173775035285374/AnsiballZ_systemd.py'
Nov 24 13:04:44 compute-1 sudo[137792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:45 compute-1 python3.9[137794]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:04:45 compute-1 systemd[1]: Reloading.
Nov 24 13:04:45 compute-1 systemd-rc-local-generator[137824]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:45 compute-1 systemd-sysv-generator[137828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:45 compute-1 sudo[137792]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:46 compute-1 sudo[137983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbxtnlpruydklmemjlkpvyqzyrpuqgas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989485.78828-653-269903123401620/AnsiballZ_systemd.py'
Nov 24 13:04:46 compute-1 sudo[137983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:46 compute-1 python3.9[137985]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:04:47 compute-1 systemd[1]: Reloading.
Nov 24 13:04:47 compute-1 systemd-rc-local-generator[138013]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:47 compute-1 systemd-sysv-generator[138018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:47 compute-1 sudo[137983]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:48 compute-1 sudo[138173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzgtigckjgysetlitiklfnwmsqmjgwwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989487.916789-711-236155323918538/AnsiballZ_systemd.py'
Nov 24 13:04:48 compute-1 sudo[138173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:48 compute-1 python3.9[138175]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:48 compute-1 systemd[1]: Reloading.
Nov 24 13:04:48 compute-1 systemd-rc-local-generator[138205]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:48 compute-1 systemd-sysv-generator[138209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:48 compute-1 sudo[138173]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:49 compute-1 sudo[138363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbegnsghzpfjoeudftdjpndtvuoxwbwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989489.1321912-711-132082672553851/AnsiballZ_systemd.py'
Nov 24 13:04:49 compute-1 sudo[138363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:49 compute-1 python3.9[138365]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:49 compute-1 systemd[1]: Reloading.
Nov 24 13:04:50 compute-1 systemd-sysv-generator[138399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:50 compute-1 systemd-rc-local-generator[138396]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:50 compute-1 sudo[138363]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:50 compute-1 sudo[138553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdywqzsxtliwtylvnjgbwwnawmhyfdkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989490.4474862-711-87787770194163/AnsiballZ_systemd.py'
Nov 24 13:04:50 compute-1 sudo[138553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:51 compute-1 python3.9[138555]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:51 compute-1 systemd[1]: Reloading.
Nov 24 13:04:51 compute-1 systemd-rc-local-generator[138582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:51 compute-1 systemd-sysv-generator[138587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:51 compute-1 sudo[138553]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:51 compute-1 sudo[138742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjgcfsxrbghsnuuhlkshrmheeragagis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989491.5338905-711-218309009814878/AnsiballZ_systemd.py'
Nov 24 13:04:51 compute-1 sudo[138742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:52 compute-1 python3.9[138744]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:52 compute-1 sudo[138742]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:52 compute-1 sudo[138897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somwdtookyxoziqhusfofoxpogrhfmik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989492.4070334-711-153103042591411/AnsiballZ_systemd.py'
Nov 24 13:04:52 compute-1 sudo[138897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:53 compute-1 python3.9[138899]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:53 compute-1 systemd[1]: Reloading.
Nov 24 13:04:53 compute-1 systemd-sysv-generator[138933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:53 compute-1 systemd-rc-local-generator[138929]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:53 compute-1 sudo[138897]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:55 compute-1 sudo[139086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cacthtjwyqpwvglgdjmsmboqpyjfoqrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989494.944381-783-278152198989060/AnsiballZ_systemd.py'
Nov 24 13:04:55 compute-1 sudo[139086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:55 compute-1 python3.9[139088]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 13:04:55 compute-1 systemd[1]: Reloading.
Nov 24 13:04:55 compute-1 systemd-rc-local-generator[139118]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:04:55 compute-1 systemd-sysv-generator[139123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:04:56 compute-1 sshd-session[139089]: Invalid user sol from 45.148.10.240 port 33108
Nov 24 13:04:56 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 24 13:04:56 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 24 13:04:56 compute-1 sshd-session[139089]: Connection closed by invalid user sol 45.148.10.240 port 33108 [preauth]
Nov 24 13:04:56 compute-1 sudo[139086]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:56 compute-1 sudo[139280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojsubkifgyzgfoyesjdrouddhxkbpmix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989496.4304385-799-345278501902/AnsiballZ_systemd.py'
Nov 24 13:04:56 compute-1 sudo[139280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:57 compute-1 python3.9[139282]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:57 compute-1 sudo[139280]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:57 compute-1 sudo[139435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iptnuxxmdsbxwokovawfnwztgrzpgtmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989497.3848143-799-94856944575496/AnsiballZ_systemd.py'
Nov 24 13:04:57 compute-1 sudo[139435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:04:58 compute-1 python3.9[139437]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:04:59 compute-1 sudo[139435]: pam_unix(sudo:session): session closed for user root
Nov 24 13:04:59 compute-1 sudo[139590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryagbihvudoccuhsuarxzjmmmhgmfbiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989499.2962766-799-67978858549221/AnsiballZ_systemd.py'
Nov 24 13:04:59 compute-1 sudo[139590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:00 compute-1 python3.9[139592]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:01 compute-1 sudo[139590]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:01 compute-1 sudo[139745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozirispoaemnddalhmqzenwhcpzptbhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989501.396851-799-28786240339800/AnsiballZ_systemd.py'
Nov 24 13:05:01 compute-1 sudo[139745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:01 compute-1 python3.9[139747]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:02 compute-1 sudo[139745]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:02 compute-1 sudo[139900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfmjltevaexcnquiqambllzmpwkhunpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989502.2625418-799-41597225988695/AnsiballZ_systemd.py'
Nov 24 13:05:02 compute-1 sudo[139900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:02 compute-1 python3.9[139902]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:02 compute-1 sudo[139900]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:03 compute-1 sudo[140055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtwncwqwefnrqmiubkghlobfqykpsqeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989503.134111-799-256652004685401/AnsiballZ_systemd.py'
Nov 24 13:05:03 compute-1 sudo[140055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:03 compute-1 python3.9[140057]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:03 compute-1 sudo[140055]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:05:04.128 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:05:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:05:04.130 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:05:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:05:04.130 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:05:04 compute-1 sudo[140210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrntypvtbwsfurxdyltxpfuxdtjjwtpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989504.0118995-799-170070534040556/AnsiballZ_systemd.py'
Nov 24 13:05:04 compute-1 sudo[140210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:04 compute-1 python3.9[140212]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:04 compute-1 sudo[140210]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:05 compute-1 sudo[140365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzvgrjyjwjccsgmxicyvgpvqdfwmvxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989504.8012743-799-229171153892969/AnsiballZ_systemd.py'
Nov 24 13:05:05 compute-1 sudo[140365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:05 compute-1 python3.9[140367]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:05 compute-1 sudo[140365]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:05 compute-1 sudo[140520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-donwusmbhuoleigirmpqvlyxwmixmkxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989505.5637887-799-238633532218441/AnsiballZ_systemd.py'
Nov 24 13:05:05 compute-1 sudo[140520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:06 compute-1 python3.9[140522]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:06 compute-1 sudo[140520]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:06 compute-1 sudo[140688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioffnockcvqgmziinepoqhbennsymxvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989506.308583-799-208571453703228/AnsiballZ_systemd.py'
Nov 24 13:05:06 compute-1 sudo[140688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:06 compute-1 podman[140649]: 2025-11-24 13:05:06.636758957 +0000 UTC m=+0.052604858 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 24 13:05:06 compute-1 python3.9[140696]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:06 compute-1 sudo[140688]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:07 compute-1 sudo[140849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cupqbpcutlyaseqbfxhzfrwtvgfouujg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989507.1135938-799-252127934008070/AnsiballZ_systemd.py'
Nov 24 13:05:07 compute-1 sudo[140849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:07 compute-1 python3.9[140851]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:07 compute-1 sudo[140849]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:08 compute-1 sudo[141004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmdzmrjdabiyiokfszvkawwbbvtosvxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989507.8776598-799-235597781169344/AnsiballZ_systemd.py'
Nov 24 13:05:08 compute-1 sudo[141004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:08 compute-1 python3.9[141006]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:08 compute-1 sudo[141004]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:09 compute-1 sudo[141170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmomclafivcpeuphlikdirhdihutzahz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989508.7846906-799-213660179243240/AnsiballZ_systemd.py'
Nov 24 13:05:09 compute-1 sudo[141170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:09 compute-1 podman[141133]: 2025-11-24 13:05:09.240254918 +0000 UTC m=+0.210599915 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:05:09 compute-1 python3.9[141179]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:09 compute-1 sudo[141170]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:09 compute-1 sudo[141338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gndsxrwmybalchfgvwvtwhwubtfpjhkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989509.5785527-799-14174059324900/AnsiballZ_systemd.py'
Nov 24 13:05:09 compute-1 sudo[141338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:10 compute-1 python3.9[141340]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 13:05:10 compute-1 sudo[141338]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:11 compute-1 sudo[141493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vixzqfxospetfxgifwlvejbcsalptvao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989510.730069-1003-280823140359679/AnsiballZ_file.py'
Nov 24 13:05:11 compute-1 sudo[141493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:11 compute-1 python3.9[141495]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:05:11 compute-1 sudo[141493]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:11 compute-1 sudo[141645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hybinihaloqtovcakufxwtiqzngkwsbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989511.4890473-1003-133767617543775/AnsiballZ_file.py'
Nov 24 13:05:11 compute-1 sudo[141645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:11 compute-1 python3.9[141647]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:05:11 compute-1 sudo[141645]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:12 compute-1 sudo[141797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bllgthdcnqtqkwbfsfyktobpcsksgolo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989512.0905015-1003-53437832862108/AnsiballZ_file.py'
Nov 24 13:05:12 compute-1 sudo[141797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:12 compute-1 python3.9[141799]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:05:12 compute-1 sudo[141797]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:13 compute-1 sudo[141949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcczlwxawdeuxedhmotrqdkhpdsphxuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989512.8213973-1003-241553930401289/AnsiballZ_file.py'
Nov 24 13:05:13 compute-1 sudo[141949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:13 compute-1 python3.9[141951]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:05:13 compute-1 sudo[141949]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:13 compute-1 sudo[142101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idwfmfqfifosgqdetzzsmeldjvgvoqab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989513.5613604-1003-258740856108314/AnsiballZ_file.py'
Nov 24 13:05:13 compute-1 sudo[142101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:14 compute-1 python3.9[142103]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:05:14 compute-1 sudo[142101]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:14 compute-1 sudo[142253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiyntatzveivfujupheiolqulqlagsts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989514.2445545-1003-4390583664969/AnsiballZ_file.py'
Nov 24 13:05:14 compute-1 sudo[142253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:14 compute-1 python3.9[142255]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:05:14 compute-1 sudo[142253]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:15 compute-1 sudo[142407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whylqfpcauticrsnaqjvmnxqoavpdrvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989515.4402905-1089-27853047954238/AnsiballZ_stat.py'
Nov 24 13:05:15 compute-1 sudo[142407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:15 compute-1 sshd-session[142280]: Invalid user admin123 from 5.198.176.28 port 42380
Nov 24 13:05:16 compute-1 python3.9[142409]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:16 compute-1 sudo[142407]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:16 compute-1 sshd-session[142280]: Received disconnect from 5.198.176.28 port 42380:11: Bye Bye [preauth]
Nov 24 13:05:16 compute-1 sshd-session[142280]: Disconnected from invalid user admin123 5.198.176.28 port 42380 [preauth]
Nov 24 13:05:16 compute-1 sudo[142532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bscqsvydnujfcbibvbrnrazrabgfhtvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989515.4402905-1089-27853047954238/AnsiballZ_copy.py'
Nov 24 13:05:16 compute-1 sudo[142532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:16 compute-1 python3.9[142534]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989515.4402905-1089-27853047954238/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:16 compute-1 sudo[142532]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:17 compute-1 sudo[142684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsugcbviilfykdeemissjekjfddmkcit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989517.0476332-1089-224005119296308/AnsiballZ_stat.py'
Nov 24 13:05:17 compute-1 sudo[142684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:17 compute-1 python3.9[142686]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:17 compute-1 sudo[142684]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:17 compute-1 sudo[142809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gowqxrlacujnljjtsehirhwvnjuyapdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989517.0476332-1089-224005119296308/AnsiballZ_copy.py'
Nov 24 13:05:17 compute-1 sudo[142809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:18 compute-1 python3.9[142811]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989517.0476332-1089-224005119296308/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:18 compute-1 sudo[142809]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:18 compute-1 sudo[142961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfmvlnggnsnpjfkhirbbxoyuguixizoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989518.3614953-1089-258021531598482/AnsiballZ_stat.py'
Nov 24 13:05:18 compute-1 sudo[142961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:18 compute-1 python3.9[142963]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:18 compute-1 sudo[142961]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:19 compute-1 sudo[143086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmpvclszxbbojwqpbjgjrshzkmggocjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989518.3614953-1089-258021531598482/AnsiballZ_copy.py'
Nov 24 13:05:19 compute-1 sudo[143086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:19 compute-1 python3.9[143088]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989518.3614953-1089-258021531598482/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:19 compute-1 sudo[143086]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:19 compute-1 sudo[143238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryoprfugnuwcdicfkrofhogikpzbzjkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989519.6359856-1089-224951743973324/AnsiballZ_stat.py'
Nov 24 13:05:19 compute-1 sudo[143238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:20 compute-1 python3.9[143240]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:20 compute-1 sudo[143238]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:20 compute-1 sudo[143363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzoemzkneuwlvuxvimtnwvpmrgdohbyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989519.6359856-1089-224951743973324/AnsiballZ_copy.py'
Nov 24 13:05:20 compute-1 sudo[143363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:20 compute-1 python3.9[143365]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989519.6359856-1089-224951743973324/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:20 compute-1 sudo[143363]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:21 compute-1 sudo[143515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agujhxqxilheyrfkuimmeodzzhzaosbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989520.8729362-1089-27157506226048/AnsiballZ_stat.py'
Nov 24 13:05:21 compute-1 sudo[143515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:21 compute-1 python3.9[143517]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:21 compute-1 sudo[143515]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:21 compute-1 sudo[143640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiugeabtetawexsrhhoenenfnfdsvlai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989520.8729362-1089-27157506226048/AnsiballZ_copy.py'
Nov 24 13:05:21 compute-1 sudo[143640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:21 compute-1 python3.9[143642]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989520.8729362-1089-27157506226048/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:21 compute-1 sudo[143640]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:22 compute-1 sudo[143792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glnbiwaudzatocooezrtgikxeqljnsjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989522.114351-1089-126560037211712/AnsiballZ_stat.py'
Nov 24 13:05:22 compute-1 sudo[143792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:22 compute-1 python3.9[143794]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:22 compute-1 sudo[143792]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:23 compute-1 sudo[143917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhypvoamgqodoavdsifyvvlfybtkggzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989522.114351-1089-126560037211712/AnsiballZ_copy.py'
Nov 24 13:05:23 compute-1 sudo[143917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:23 compute-1 python3.9[143919]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989522.114351-1089-126560037211712/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:23 compute-1 sudo[143917]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:23 compute-1 sudo[144069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgbqbncukquumawsisklgvwwmudhindz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989523.4178169-1089-79838126715523/AnsiballZ_stat.py'
Nov 24 13:05:23 compute-1 sudo[144069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:23 compute-1 python3.9[144071]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:23 compute-1 sudo[144069]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:24 compute-1 sudo[144192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpiztkrdhijekkighksygkimdmjicovh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989523.4178169-1089-79838126715523/AnsiballZ_copy.py'
Nov 24 13:05:24 compute-1 sudo[144192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:24 compute-1 python3.9[144194]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989523.4178169-1089-79838126715523/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:24 compute-1 sudo[144192]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:24 compute-1 sudo[144344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeymsupqtqbxhqskguuwinzsazjkugpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989524.6692252-1089-62972300716690/AnsiballZ_stat.py'
Nov 24 13:05:24 compute-1 sudo[144344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:25 compute-1 python3.9[144346]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:25 compute-1 sudo[144344]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:25 compute-1 sudo[144469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdtxeocjeucqdztpxdhukezzpdapndve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989524.6692252-1089-62972300716690/AnsiballZ_copy.py'
Nov 24 13:05:25 compute-1 sudo[144469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:25 compute-1 python3.9[144471]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763989524.6692252-1089-62972300716690/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:25 compute-1 sudo[144469]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:27 compute-1 sudo[144621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaqtebqshiazlyynlkwktzzilmcjuziy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989527.0130682-1315-106326169451678/AnsiballZ_command.py'
Nov 24 13:05:27 compute-1 sudo[144621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:27 compute-1 python3.9[144623]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 24 13:05:27 compute-1 sudo[144621]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:28 compute-1 sudo[144774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-humdwlfninjketachpqdeyqontjdswtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989527.9815683-1333-101971486807557/AnsiballZ_file.py'
Nov 24 13:05:28 compute-1 sudo[144774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:28 compute-1 python3.9[144776]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:28 compute-1 sudo[144774]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:28 compute-1 sudo[144926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdonzcxxujtjkpbgkmeagqhjsvzldwbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989528.6345313-1333-30783655824863/AnsiballZ_file.py'
Nov 24 13:05:28 compute-1 sudo[144926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:29 compute-1 python3.9[144928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:29 compute-1 sudo[144926]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:29 compute-1 sudo[145078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpzwskdchxpzkzdhaxxhjaonowagldbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989529.3682203-1333-6975633949950/AnsiballZ_file.py'
Nov 24 13:05:29 compute-1 sudo[145078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:29 compute-1 python3.9[145080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:29 compute-1 sudo[145078]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:30 compute-1 sudo[145230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppxthmpbbfbvpzonsqeqxchohrwqeost ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989530.0512-1333-49924219993667/AnsiballZ_file.py'
Nov 24 13:05:30 compute-1 sudo[145230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:30 compute-1 python3.9[145232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:30 compute-1 sudo[145230]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:31 compute-1 sudo[145382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmlstpdufmuhteeqsjndiamayavorpjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989530.6931856-1333-86266168281809/AnsiballZ_file.py'
Nov 24 13:05:31 compute-1 sudo[145382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:31 compute-1 python3.9[145384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:31 compute-1 sudo[145382]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:31 compute-1 sudo[145534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oloedekbkttbqthnedobcudtlgfntvgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989531.3412993-1333-87808828881604/AnsiballZ_file.py'
Nov 24 13:05:31 compute-1 sudo[145534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:31 compute-1 python3.9[145536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:31 compute-1 sudo[145534]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:32 compute-1 sudo[145686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeivfezobfkdkaeokwyfzrrawhxklqli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989532.0140967-1333-170124962188803/AnsiballZ_file.py'
Nov 24 13:05:32 compute-1 sudo[145686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:32 compute-1 python3.9[145688]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:32 compute-1 sudo[145686]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:32 compute-1 sudo[145840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgumxhyxhdswktbaxisncpmxnevutkbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989532.6433623-1333-216697337433050/AnsiballZ_file.py'
Nov 24 13:05:32 compute-1 sudo[145840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:33 compute-1 python3.9[145842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:33 compute-1 sudo[145840]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:33 compute-1 sudo[145992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgoxkxikoamcwmgoxaewpubgpbjjaavw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989533.2692323-1333-89208039471726/AnsiballZ_file.py'
Nov 24 13:05:33 compute-1 sudo[145992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:33 compute-1 python3.9[145994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:33 compute-1 sudo[145992]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:33 compute-1 sshd-session[145713]: Invalid user in from 175.100.24.139 port 55990
Nov 24 13:05:34 compute-1 sudo[146144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqnxwslmrbshohruchoxncbgoeiviqyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989533.9028316-1333-242487839801314/AnsiballZ_file.py'
Nov 24 13:05:34 compute-1 sudo[146144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:34 compute-1 sshd-session[145713]: Received disconnect from 175.100.24.139 port 55990:11: Bye Bye [preauth]
Nov 24 13:05:34 compute-1 sshd-session[145713]: Disconnected from invalid user in 175.100.24.139 port 55990 [preauth]
Nov 24 13:05:34 compute-1 python3.9[146146]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:34 compute-1 sudo[146144]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:34 compute-1 sudo[146296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvwbhvgwrqrxqydjbctlagmvugokowmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989534.4770906-1333-20224969468748/AnsiballZ_file.py'
Nov 24 13:05:34 compute-1 sudo[146296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:34 compute-1 python3.9[146298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:35 compute-1 sudo[146296]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:35 compute-1 sudo[146448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-picljwtwpbfpiaebphrsgxibrahhoqcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989535.1610274-1333-196349216525632/AnsiballZ_file.py'
Nov 24 13:05:35 compute-1 sudo[146448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:35 compute-1 python3.9[146450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:35 compute-1 sudo[146448]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:36 compute-1 sudo[146600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcwrasdwrcvvrnssrbzbbgridkbaqngq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989535.7491221-1333-118187701659112/AnsiballZ_file.py'
Nov 24 13:05:36 compute-1 sudo[146600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:36 compute-1 python3.9[146602]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:36 compute-1 sudo[146600]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:36 compute-1 sudo[146752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccdyzfqettoqxoqivnhcgeqlnzjjyiye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989536.3347611-1333-52283456006356/AnsiballZ_file.py'
Nov 24 13:05:36 compute-1 sudo[146752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:36 compute-1 podman[146754]: 2025-11-24 13:05:36.731894412 +0000 UTC m=+0.055573908 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 24 13:05:36 compute-1 python3.9[146755]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:36 compute-1 sudo[146752]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:38 compute-1 sudo[146923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyrnmyuutqvoowyswsbqssdwqxspwbmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989537.9761772-1531-201231633136639/AnsiballZ_stat.py'
Nov 24 13:05:38 compute-1 sudo[146923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:38 compute-1 python3.9[146925]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:38 compute-1 sudo[146923]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:38 compute-1 sudo[147046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sipgqksitvhyaqplwbhishscdojojkph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989537.9761772-1531-201231633136639/AnsiballZ_copy.py'
Nov 24 13:05:38 compute-1 sudo[147046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:39 compute-1 python3.9[147048]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989537.9761772-1531-201231633136639/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:39 compute-1 sudo[147046]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:39 compute-1 podman[147154]: 2025-11-24 13:05:39.515626749 +0000 UTC m=+0.066192594 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:05:39 compute-1 sudo[147225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlhkjqglatjmsqxarpfqlobmxsqtmrgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989539.2808945-1531-255425107508784/AnsiballZ_stat.py'
Nov 24 13:05:39 compute-1 sudo[147225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:39 compute-1 python3.9[147228]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:39 compute-1 sudo[147225]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:40 compute-1 sudo[147349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoeefciilkjvqnxkckqtxvybzifnubzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989539.2808945-1531-255425107508784/AnsiballZ_copy.py'
Nov 24 13:05:40 compute-1 sudo[147349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:40 compute-1 python3.9[147351]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989539.2808945-1531-255425107508784/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:40 compute-1 sudo[147349]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:40 compute-1 sudo[147501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-satqbkhzruobrcpfhmwrklqcmfyuwlks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989540.3492239-1531-228553574840423/AnsiballZ_stat.py'
Nov 24 13:05:40 compute-1 sudo[147501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:40 compute-1 python3.9[147503]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:40 compute-1 sudo[147501]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:41 compute-1 sudo[147624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyumsnswarwfzldumlbshxdakyxckkbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989540.3492239-1531-228553574840423/AnsiballZ_copy.py'
Nov 24 13:05:41 compute-1 sudo[147624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:41 compute-1 python3.9[147626]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989540.3492239-1531-228553574840423/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:41 compute-1 sudo[147624]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:41 compute-1 sudo[147776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruxczhvzldswsowfbgxlskxucrrzmsam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989541.6969805-1531-29605498610198/AnsiballZ_stat.py'
Nov 24 13:05:41 compute-1 sudo[147776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:42 compute-1 python3.9[147778]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:42 compute-1 sudo[147776]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:42 compute-1 sudo[147899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeplwcoajulhsoxemxqiflcmeiifaqwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989541.6969805-1531-29605498610198/AnsiballZ_copy.py'
Nov 24 13:05:42 compute-1 sudo[147899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:42 compute-1 python3.9[147901]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989541.6969805-1531-29605498610198/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:42 compute-1 sudo[147899]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:43 compute-1 sudo[148051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozhlzmzsyluvhsiicbibqjnobypokkba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989542.902809-1531-36726804257120/AnsiballZ_stat.py'
Nov 24 13:05:43 compute-1 sudo[148051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:43 compute-1 python3.9[148053]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:43 compute-1 sudo[148051]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:43 compute-1 sudo[148176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsupytpksbazkyrxoxdcrzrlyziihqvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989542.902809-1531-36726804257120/AnsiballZ_copy.py'
Nov 24 13:05:43 compute-1 sudo[148176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:44 compute-1 python3.9[148178]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989542.902809-1531-36726804257120/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:44 compute-1 sudo[148176]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:44 compute-1 sudo[148328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsufaxdkjwaaekamgljrocdkieuvwbvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989544.2369688-1531-163908146707903/AnsiballZ_stat.py'
Nov 24 13:05:44 compute-1 sudo[148328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:44 compute-1 sshd-session[148054]: Invalid user supermaint from 45.78.194.40 port 49868
Nov 24 13:05:44 compute-1 python3.9[148330]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:44 compute-1 sudo[148328]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:44 compute-1 sshd-session[148054]: Received disconnect from 45.78.194.40 port 49868:11: Bye Bye [preauth]
Nov 24 13:05:44 compute-1 sshd-session[148054]: Disconnected from invalid user supermaint 45.78.194.40 port 49868 [preauth]
Nov 24 13:05:45 compute-1 sudo[148451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gogbxsvukpoiqgcbnjrxmjefwszkofob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989544.2369688-1531-163908146707903/AnsiballZ_copy.py'
Nov 24 13:05:45 compute-1 sudo[148451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:45 compute-1 python3.9[148453]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989544.2369688-1531-163908146707903/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:45 compute-1 sudo[148451]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:46 compute-1 sudo[148603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-genpwksmlneuquyamwcchleocmjkojjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989545.641992-1531-252239773857388/AnsiballZ_stat.py'
Nov 24 13:05:46 compute-1 sudo[148603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:46 compute-1 python3.9[148605]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:46 compute-1 sudo[148603]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:46 compute-1 sudo[148726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmxwazftpjmyfcqondrwfhkelmrpgonj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989545.641992-1531-252239773857388/AnsiballZ_copy.py'
Nov 24 13:05:46 compute-1 sudo[148726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:46 compute-1 python3.9[148728]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989545.641992-1531-252239773857388/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:46 compute-1 sudo[148726]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:47 compute-1 sudo[148878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptnyjplbnjnedhaeheemmqkdnupmypwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989547.0297487-1531-212526743558019/AnsiballZ_stat.py'
Nov 24 13:05:47 compute-1 sudo[148878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:47 compute-1 python3.9[148880]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:47 compute-1 sudo[148878]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:48 compute-1 sudo[149001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwcblxmokosvxzymlrlbezawekoxwhsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989547.0297487-1531-212526743558019/AnsiballZ_copy.py'
Nov 24 13:05:48 compute-1 sudo[149001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:48 compute-1 python3.9[149003]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989547.0297487-1531-212526743558019/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:48 compute-1 sudo[149001]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:48 compute-1 sudo[149153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftiixpqbzminmmenrgevzudeutgvcgdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989548.4270139-1531-184265356131316/AnsiballZ_stat.py'
Nov 24 13:05:48 compute-1 sudo[149153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:48 compute-1 python3.9[149155]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:48 compute-1 sudo[149153]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:49 compute-1 sudo[149276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwemeibtcrouaewrhvzuydenczyasces ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989548.4270139-1531-184265356131316/AnsiballZ_copy.py'
Nov 24 13:05:49 compute-1 sudo[149276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:49 compute-1 python3.9[149278]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989548.4270139-1531-184265356131316/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:49 compute-1 sudo[149276]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:50 compute-1 sudo[149428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uonznbgpqhjojbiqifzrjikzgxjnqdst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989549.7716668-1531-80530608055738/AnsiballZ_stat.py'
Nov 24 13:05:50 compute-1 sudo[149428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:50 compute-1 python3.9[149430]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:50 compute-1 sudo[149428]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:50 compute-1 sudo[149551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzbzlvenrabhfsooqckzmwlizdbyyczx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989549.7716668-1531-80530608055738/AnsiballZ_copy.py'
Nov 24 13:05:50 compute-1 sudo[149551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:50 compute-1 python3.9[149553]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989549.7716668-1531-80530608055738/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:50 compute-1 sudo[149551]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:51 compute-1 sudo[149703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diytjkodlodngwerqzontgqhcsccsktu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989551.007264-1531-66292559755895/AnsiballZ_stat.py'
Nov 24 13:05:51 compute-1 sudo[149703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:51 compute-1 python3.9[149705]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:51 compute-1 sudo[149703]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:51 compute-1 sudo[149826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrlffwshxpphmwshuxvugxwdkralbgpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989551.007264-1531-66292559755895/AnsiballZ_copy.py'
Nov 24 13:05:51 compute-1 sudo[149826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:52 compute-1 python3.9[149828]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989551.007264-1531-66292559755895/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:52 compute-1 sudo[149826]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:52 compute-1 sudo[149978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urigobubpuqjioatdqetvlpzqxyodqip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989552.1540694-1531-139974866568713/AnsiballZ_stat.py'
Nov 24 13:05:52 compute-1 sudo[149978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:52 compute-1 python3.9[149980]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:52 compute-1 sudo[149978]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:52 compute-1 sudo[150101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyanhtwbxpfjhgjaqmhkryqqdfjlzqvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989552.1540694-1531-139974866568713/AnsiballZ_copy.py'
Nov 24 13:05:52 compute-1 sudo[150101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:53 compute-1 python3.9[150103]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989552.1540694-1531-139974866568713/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:53 compute-1 sudo[150101]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:53 compute-1 sudo[150254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hklrmanmztjkeeypdojnypumempphphd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989553.2702856-1531-234632428183576/AnsiballZ_stat.py'
Nov 24 13:05:53 compute-1 sudo[150254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:53 compute-1 python3.9[150256]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:53 compute-1 sudo[150254]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:54 compute-1 sudo[150377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sikevobaoigmilmejnaovufruxetkmxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989553.2702856-1531-234632428183576/AnsiballZ_copy.py'
Nov 24 13:05:54 compute-1 sudo[150377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:54 compute-1 python3.9[150379]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989553.2702856-1531-234632428183576/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:54 compute-1 sudo[150377]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:54 compute-1 sudo[150529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqqsxbhntjmxjbdavndjbhspgcrakrfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989554.4192214-1531-241610617108553/AnsiballZ_stat.py'
Nov 24 13:05:54 compute-1 sudo[150529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:54 compute-1 python3.9[150531]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:05:55 compute-1 sudo[150529]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:55 compute-1 sudo[150652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uktyzrdccdqxxpjefsazhtpmdggoqhxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989554.4192214-1531-241610617108553/AnsiballZ_copy.py'
Nov 24 13:05:55 compute-1 sudo[150652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:55 compute-1 python3.9[150654]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989554.4192214-1531-241610617108553/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:05:55 compute-1 sudo[150652]: pam_unix(sudo:session): session closed for user root
Nov 24 13:05:57 compute-1 python3.9[150804]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:05:58 compute-1 sudo[150957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndqbjavbcjtfgxugrtfbkfthqbvbsvkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989558.1317613-1943-106971337600707/AnsiballZ_seboolean.py'
Nov 24 13:05:58 compute-1 sudo[150957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:05:58 compute-1 python3.9[150959]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 24 13:05:59 compute-1 sudo[150957]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:00 compute-1 sudo[151113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flkjvmaxhlmjioepexkmwvsfcpfphfdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989560.1583264-1959-221742203138514/AnsiballZ_copy.py'
Nov 24 13:06:00 compute-1 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 24 13:06:00 compute-1 sudo[151113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:00 compute-1 python3.9[151115]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:00 compute-1 sudo[151113]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:01 compute-1 sudo[151265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgzfgditfuedcdvggapishmtxaakkzmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989560.8199978-1959-176284249166146/AnsiballZ_copy.py'
Nov 24 13:06:01 compute-1 sudo[151265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:01 compute-1 python3.9[151267]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:01 compute-1 sudo[151265]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:01 compute-1 sudo[151419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwbtbhppngwytjzrefpbrhyhdsfxtghg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989561.4463713-1959-95203587970881/AnsiballZ_copy.py'
Nov 24 13:06:01 compute-1 sudo[151419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:01 compute-1 python3.9[151421]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:01 compute-1 sudo[151419]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:02 compute-1 sudo[151571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmneexmspkugylzpdceudhozcatctoso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989562.0922453-1959-263544099621543/AnsiballZ_copy.py'
Nov 24 13:06:02 compute-1 sudo[151571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:02 compute-1 python3.9[151573]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:02 compute-1 sudo[151571]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:02 compute-1 sshd-session[151268]: Received disconnect from 68.183.82.237 port 36022:11: Bye Bye [preauth]
Nov 24 13:06:02 compute-1 sshd-session[151268]: Disconnected from authenticating user root 68.183.82.237 port 36022 [preauth]
Nov 24 13:06:02 compute-1 sudo[151723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrsdzzvuuyvpkxuxxqlynlumdmvtpyxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989562.6816812-1959-258419358925921/AnsiballZ_copy.py'
Nov 24 13:06:02 compute-1 sudo[151723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:03 compute-1 python3.9[151725]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:03 compute-1 sudo[151723]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:03 compute-1 sshd-session[150104]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:06:03 compute-1 sshd-session[150104]: banner exchange: Connection from 115.190.107.104 port 37910: Connection timed out
Nov 24 13:06:03 compute-1 sudo[151875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zueoezuqhytxhmvhrsxxroqmwenxgxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989563.7186532-2031-261035841359204/AnsiballZ_copy.py'
Nov 24 13:06:04 compute-1 sudo[151875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:06:04.130 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:06:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:06:04.132 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:06:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:06:04.132 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:06:04 compute-1 python3.9[151877]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:04 compute-1 sudo[151875]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:04 compute-1 sudo[152027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqzxohqfltratmyoppkauadpxdlalulb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989564.343094-2031-154952488890905/AnsiballZ_copy.py'
Nov 24 13:06:04 compute-1 sudo[152027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:04 compute-1 python3.9[152029]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:04 compute-1 sudo[152027]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:05 compute-1 sudo[152179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiraogayiogcbjwnlrthbejrtscfkreu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989564.996405-2031-41289375066450/AnsiballZ_copy.py'
Nov 24 13:06:05 compute-1 sudo[152179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:05 compute-1 python3.9[152181]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:05 compute-1 sudo[152179]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:05 compute-1 sshd-session[152235]: Received disconnect from 85.209.134.43 port 35478:11: Bye Bye [preauth]
Nov 24 13:06:05 compute-1 sshd-session[152235]: Disconnected from authenticating user root 85.209.134.43 port 35478 [preauth]
Nov 24 13:06:05 compute-1 sudo[152333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvujnciczptljqcurrgueuhbqkzcverc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989565.6121805-2031-19349440841218/AnsiballZ_copy.py'
Nov 24 13:06:05 compute-1 sudo[152333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:06 compute-1 python3.9[152335]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:06 compute-1 sudo[152333]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:06 compute-1 sudo[152485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulaspappehznvmybwkejdigskzdstgie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989566.1802528-2031-55549932508997/AnsiballZ_copy.py'
Nov 24 13:06:06 compute-1 sudo[152485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:06 compute-1 python3.9[152487]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:06 compute-1 sudo[152485]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:07 compute-1 sudo[152651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnvixjfftukjyzisdedsfvfguqykdoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989567.1564941-2103-15435066863720/AnsiballZ_systemd.py'
Nov 24 13:06:07 compute-1 sudo[152651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:07 compute-1 podman[152604]: 2025-11-24 13:06:07.515608781 +0000 UTC m=+0.052704250 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:06:07 compute-1 python3.9[152658]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:06:07 compute-1 systemd[1]: Reloading.
Nov 24 13:06:07 compute-1 systemd-rc-local-generator[152685]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:07 compute-1 systemd-sysv-generator[152689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:08 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Nov 24 13:06:08 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Nov 24 13:06:08 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 24 13:06:08 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 24 13:06:08 compute-1 systemd[1]: Starting libvirt logging daemon...
Nov 24 13:06:08 compute-1 systemd[1]: Started libvirt logging daemon.
Nov 24 13:06:08 compute-1 sudo[152651]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:08 compute-1 sudo[152849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugbstvktumspwmbyjuhrjfxponvnbwbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989568.2673132-2103-39090332436605/AnsiballZ_systemd.py'
Nov 24 13:06:08 compute-1 sudo[152849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:08 compute-1 python3.9[152851]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:06:08 compute-1 systemd[1]: Reloading.
Nov 24 13:06:08 compute-1 systemd-rc-local-generator[152877]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:08 compute-1 systemd-sysv-generator[152880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:09 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 24 13:06:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 24 13:06:09 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 24 13:06:09 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 24 13:06:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 24 13:06:09 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 24 13:06:09 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Nov 24 13:06:09 compute-1 systemd[1]: Started libvirt nodedev daemon.
Nov 24 13:06:09 compute-1 sudo[152849]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:09 compute-1 sudo[153066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjmujztobtgjqmgnakzhyvszwybjkxgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989569.2775178-2103-84172445459402/AnsiballZ_systemd.py'
Nov 24 13:06:09 compute-1 sudo[153066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:09 compute-1 python3.9[153068]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:06:09 compute-1 systemd[1]: Reloading.
Nov 24 13:06:09 compute-1 systemd-sysv-generator[153117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:09 compute-1 systemd-rc-local-generator[153114]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:09 compute-1 podman[153070]: 2025-11-24 13:06:09.967696463 +0000 UTC m=+0.087280832 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:06:10 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 24 13:06:10 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 24 13:06:10 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 24 13:06:10 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 24 13:06:10 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 24 13:06:10 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 24 13:06:10 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 24 13:06:10 compute-1 sudo[153066]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:10 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 24 13:06:10 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 24 13:06:10 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 24 13:06:10 compute-1 sudo[153308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaetysysifflqorfeikidfdryumcuvct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989570.38807-2103-235929551875191/AnsiballZ_systemd.py'
Nov 24 13:06:10 compute-1 sudo[153308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:10 compute-1 python3.9[153314]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:06:10 compute-1 systemd[1]: Reloading.
Nov 24 13:06:11 compute-1 systemd-rc-local-generator[153340]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:11 compute-1 systemd-sysv-generator[153343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:11 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Nov 24 13:06:11 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 24 13:06:11 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 24 13:06:11 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 24 13:06:11 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 24 13:06:11 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 24 13:06:11 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 24 13:06:11 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 24 13:06:11 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 24 13:06:11 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 24 13:06:11 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Nov 24 13:06:11 compute-1 systemd[1]: Started libvirt QEMU daemon.
Nov 24 13:06:11 compute-1 sudo[153308]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:11 compute-1 setroubleshoot[153131]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 80d82833-37eb-4c48-93ea-bff055d75572
Nov 24 13:06:11 compute-1 setroubleshoot[153131]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 24 13:06:11 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 13:06:11 compute-1 sudo[153530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awjbwyrspmpfbtgswfeujojhvprvotqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989571.6120107-2103-241631839867926/AnsiballZ_systemd.py'
Nov 24 13:06:11 compute-1 sudo[153530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:12 compute-1 python3.9[153532]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:06:12 compute-1 systemd[1]: Reloading.
Nov 24 13:06:12 compute-1 systemd-rc-local-generator[153558]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:12 compute-1 systemd-sysv-generator[153561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:12 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Nov 24 13:06:12 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Nov 24 13:06:12 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 24 13:06:12 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 24 13:06:12 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 24 13:06:12 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 24 13:06:12 compute-1 systemd[1]: Starting libvirt secret daemon...
Nov 24 13:06:12 compute-1 systemd[1]: Started libvirt secret daemon.
Nov 24 13:06:12 compute-1 sudo[153530]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:14 compute-1 sudo[153741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leuhmfplvyipentsikkrsywdwiuahsdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989573.7739167-2177-99151424802032/AnsiballZ_file.py'
Nov 24 13:06:14 compute-1 sudo[153741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:14 compute-1 python3.9[153743]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:14 compute-1 sudo[153741]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:14 compute-1 sudo[153893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbqsrjcvdkgetwpgpphushhctcusaujf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989574.6343806-2193-84840860026038/AnsiballZ_find.py'
Nov 24 13:06:14 compute-1 sudo[153893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:15 compute-1 python3.9[153895]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 13:06:15 compute-1 sudo[153893]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:16 compute-1 sudo[154045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frtzcaibyozdskdmeekgqjmjsztejjqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989575.6934211-2221-265348995890570/AnsiballZ_stat.py'
Nov 24 13:06:16 compute-1 sudo[154045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:16 compute-1 python3.9[154047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:16 compute-1 sudo[154045]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:16 compute-1 sudo[154168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pworhjhuonviajlxlldkdbqioxiuwnll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989575.6934211-2221-265348995890570/AnsiballZ_copy.py'
Nov 24 13:06:16 compute-1 sudo[154168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:16 compute-1 python3.9[154170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989575.6934211-2221-265348995890570/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:16 compute-1 sudo[154168]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:17 compute-1 sudo[154320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fefywoozwzydmzijaeavfxbwydlrkrlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989577.305481-2253-133034308509388/AnsiballZ_file.py'
Nov 24 13:06:17 compute-1 sudo[154320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:17 compute-1 python3.9[154322]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:17 compute-1 sudo[154320]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:18 compute-1 sudo[154472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsyekygdlykhhsxsdyegoqtjjmqbmdwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989578.0660923-2269-133640651810088/AnsiballZ_stat.py'
Nov 24 13:06:18 compute-1 sudo[154472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:18 compute-1 python3.9[154474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:18 compute-1 sudo[154472]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:18 compute-1 sudo[154550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbwxtjyxttoiydjnbjzajvenjifxlwxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989578.0660923-2269-133640651810088/AnsiballZ_file.py'
Nov 24 13:06:18 compute-1 sudo[154550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:19 compute-1 python3.9[154552]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:19 compute-1 sudo[154550]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:19 compute-1 sudo[154702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnpxolfrecmpvebtxvaaeasqthbknqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989579.448273-2293-41276083894957/AnsiballZ_stat.py'
Nov 24 13:06:19 compute-1 sudo[154702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:19 compute-1 python3.9[154704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:19 compute-1 sudo[154702]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:20 compute-1 sudo[154780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krnevnuixfkrhgmqilmbwdeiwduejvkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989579.448273-2293-41276083894957/AnsiballZ_file.py'
Nov 24 13:06:20 compute-1 sudo[154780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:20 compute-1 python3.9[154782]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.a81cuzie recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:20 compute-1 sudo[154780]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:20 compute-1 sudo[154932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgxedyvegrlooxtcerwtnpokyzkwtvaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989580.6865673-2318-105182782615568/AnsiballZ_stat.py'
Nov 24 13:06:20 compute-1 sudo[154932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:21 compute-1 python3.9[154934]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:21 compute-1 sudo[154932]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:21 compute-1 sshd-session[153279]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:06:21 compute-1 sshd-session[153279]: banner exchange: Connection from 115.190.105.137 port 43604: Connection timed out
Nov 24 13:06:21 compute-1 sudo[155010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkyqejobybuvrvxbzjmpyjbbevzngaoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989580.6865673-2318-105182782615568/AnsiballZ_file.py'
Nov 24 13:06:21 compute-1 sudo[155010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:21 compute-1 python3.9[155012]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:21 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 24 13:06:21 compute-1 sudo[155010]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:21 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 24 13:06:22 compute-1 sudo[155162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkqphcryifvuchspmnpdrmmfnadfejpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989581.946141-2343-60910980500477/AnsiballZ_command.py'
Nov 24 13:06:22 compute-1 sudo[155162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:22 compute-1 python3.9[155164]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:06:22 compute-1 sudo[155162]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:23 compute-1 sudo[155315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnzanvylblbzhfezyqzghvbqaondnhnk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989582.8609204-2359-80420508143738/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 13:06:23 compute-1 sudo[155315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:23 compute-1 python3[155317]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 13:06:23 compute-1 sudo[155315]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:24 compute-1 sudo[155467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnubfnkqpjeyjeywfjolgiookrifdsgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989583.7976387-2375-52479153717949/AnsiballZ_stat.py'
Nov 24 13:06:24 compute-1 sudo[155467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:24 compute-1 python3.9[155469]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:24 compute-1 sudo[155467]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:24 compute-1 sudo[155545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfzkriqezkzjfdzsbthaynbfnkhzhotq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989583.7976387-2375-52479153717949/AnsiballZ_file.py'
Nov 24 13:06:24 compute-1 sudo[155545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:24 compute-1 python3.9[155547]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:24 compute-1 sudo[155545]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:25 compute-1 sudo[155699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znbfendkrwpyvrjihvbnymzfvpufvvma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989585.0913079-2399-180060984270320/AnsiballZ_stat.py'
Nov 24 13:06:25 compute-1 sudo[155699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:25 compute-1 python3.9[155701]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:25 compute-1 sudo[155699]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:25 compute-1 sudo[155777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izgrobcygdngioyskewkqsngmnjzuldb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989585.0913079-2399-180060984270320/AnsiballZ_file.py'
Nov 24 13:06:25 compute-1 sudo[155777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:26 compute-1 sshd-session[155647]: Invalid user stperez from 176.114.89.34 port 53330
Nov 24 13:06:26 compute-1 python3.9[155779]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:26 compute-1 sudo[155777]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:26 compute-1 sshd-session[155647]: Received disconnect from 176.114.89.34 port 53330:11: Bye Bye [preauth]
Nov 24 13:06:26 compute-1 sshd-session[155647]: Disconnected from invalid user stperez 176.114.89.34 port 53330 [preauth]
Nov 24 13:06:26 compute-1 sudo[155929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcifxbhuznvamqxnxdmiacseubbnjqpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989586.54586-2423-158602328558349/AnsiballZ_stat.py'
Nov 24 13:06:26 compute-1 sudo[155929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:27 compute-1 python3.9[155931]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:27 compute-1 sudo[155929]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:27 compute-1 sudo[156007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjzrwjrguyotycjkrcefkuiopopjnpcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989586.54586-2423-158602328558349/AnsiballZ_file.py'
Nov 24 13:06:27 compute-1 sudo[156007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:27 compute-1 python3.9[156009]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:27 compute-1 sudo[156007]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:28 compute-1 sudo[156159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxrudffujdmdonhorybvsicwjzfhfnnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989587.931404-2447-269776939013395/AnsiballZ_stat.py'
Nov 24 13:06:28 compute-1 sudo[156159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:28 compute-1 python3.9[156161]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:28 compute-1 sudo[156159]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:28 compute-1 sudo[156238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvzthyqnftbfsecfmbyzwfkaxexmnxnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989587.931404-2447-269776939013395/AnsiballZ_file.py'
Nov 24 13:06:28 compute-1 sudo[156238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:28 compute-1 python3.9[156240]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:28 compute-1 sudo[156238]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:29 compute-1 sudo[156391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgwbwbvsmytfowfispxyndwiagxdspmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989589.2218812-2471-57003049543302/AnsiballZ_stat.py'
Nov 24 13:06:29 compute-1 sudo[156391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:29 compute-1 python3.9[156393]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:29 compute-1 sudo[156391]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:30 compute-1 sudo[156516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thcnprujlalhsfrichspkmnvzildoxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989589.2218812-2471-57003049543302/AnsiballZ_copy.py'
Nov 24 13:06:30 compute-1 sudo[156516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:30 compute-1 python3.9[156518]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989589.2218812-2471-57003049543302/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:30 compute-1 sudo[156516]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:31 compute-1 sudo[156668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbyrhefvbpszfjjhpkoysatosuinspcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989590.7458882-2501-264452636084712/AnsiballZ_file.py'
Nov 24 13:06:31 compute-1 sudo[156668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:31 compute-1 python3.9[156670]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:31 compute-1 sudo[156668]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:32 compute-1 sudo[156820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-parugrinwterzhyagrilqgzykfmboceu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989591.556263-2517-130037024989211/AnsiballZ_command.py'
Nov 24 13:06:32 compute-1 sudo[156820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:32 compute-1 python3.9[156822]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:06:32 compute-1 sshd-session[156164]: Connection closed by 45.78.217.131 port 41438 [preauth]
Nov 24 13:06:32 compute-1 sudo[156820]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:33 compute-1 sudo[156975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqsoddyxyowtfhskfmgbgdhdwnevybyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989592.5887532-2533-230713859036119/AnsiballZ_blockinfile.py'
Nov 24 13:06:33 compute-1 sudo[156975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:33 compute-1 python3.9[156977]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:33 compute-1 sudo[156975]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:33 compute-1 sudo[157127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xocoayzxjjfajazfgdctfqomkaxxlnnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989593.6706014-2551-106934473555676/AnsiballZ_command.py'
Nov 24 13:06:33 compute-1 sudo[157127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:34 compute-1 python3.9[157129]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:06:34 compute-1 sudo[157127]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:34 compute-1 sudo[157281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-susownjbjerqgdyhrklimawrztkzhugg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989594.3903496-2567-211902237811948/AnsiballZ_stat.py'
Nov 24 13:06:34 compute-1 sudo[157281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:34 compute-1 python3.9[157283]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:06:34 compute-1 sudo[157281]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:35 compute-1 sudo[157435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajitkjciwfwodstkbmoxnkgmrvpiyjmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989595.2224104-2583-260351102033305/AnsiballZ_command.py'
Nov 24 13:06:35 compute-1 sudo[157435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:35 compute-1 python3.9[157437]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:06:35 compute-1 sudo[157435]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:36 compute-1 sudo[157590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrtrptgjlbmvcuesmllkvtqnegzvvnwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989595.9387834-2599-155635095367112/AnsiballZ_file.py'
Nov 24 13:06:36 compute-1 sudo[157590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:36 compute-1 python3.9[157592]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:36 compute-1 sudo[157590]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:37 compute-1 sudo[157742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idcjbyricrcysxjziznxtncgzjxcmeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989596.698853-2615-236889274041270/AnsiballZ_stat.py'
Nov 24 13:06:37 compute-1 sudo[157742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:37 compute-1 python3.9[157744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:37 compute-1 sudo[157742]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:37 compute-1 sudo[157865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxhpxlxmaupohnpfblaejjvgnuacolnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989596.698853-2615-236889274041270/AnsiballZ_copy.py'
Nov 24 13:06:37 compute-1 sudo[157865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:37 compute-1 podman[157867]: 2025-11-24 13:06:37.674195506 +0000 UTC m=+0.068876658 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 13:06:37 compute-1 python3.9[157868]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989596.698853-2615-236889274041270/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:37 compute-1 sudo[157865]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:38 compute-1 sudo[158037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldoewglqceznhfgtsosqirukkqdgcijv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989598.093559-2646-275485230256439/AnsiballZ_stat.py'
Nov 24 13:06:38 compute-1 sudo[158037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:38 compute-1 python3.9[158039]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:38 compute-1 sudo[158037]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:38 compute-1 sudo[158160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atrrsuxbnfwxkxefbjlnodiqitxfhdxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989598.093559-2646-275485230256439/AnsiballZ_copy.py'
Nov 24 13:06:38 compute-1 sudo[158160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:39 compute-1 python3.9[158162]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989598.093559-2646-275485230256439/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:39 compute-1 sudo[158160]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:39 compute-1 sudo[158312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdvjzzucucgbgwwwojklkgvpkyyydcfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989599.5077932-2676-247500594643656/AnsiballZ_stat.py'
Nov 24 13:06:39 compute-1 sudo[158312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:39 compute-1 python3.9[158314]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:06:39 compute-1 sudo[158312]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:40 compute-1 sudo[158451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjnkawtrajmjggbkmnajeffawlgcargv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989599.5077932-2676-247500594643656/AnsiballZ_copy.py'
Nov 24 13:06:40 compute-1 sudo[158451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:40 compute-1 podman[158409]: 2025-11-24 13:06:40.374532996 +0000 UTC m=+0.097905749 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:06:40 compute-1 python3.9[158456]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989599.5077932-2676-247500594643656/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:06:40 compute-1 sudo[158451]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:41 compute-1 sudo[158612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvlidrzzpqweyrxknhaenvhqvqsjgcih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989600.8055432-2705-216347440496166/AnsiballZ_systemd.py'
Nov 24 13:06:41 compute-1 sudo[158612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:41 compute-1 python3.9[158614]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:06:41 compute-1 systemd[1]: Reloading.
Nov 24 13:06:41 compute-1 systemd-sysv-generator[158646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:41 compute-1 systemd-rc-local-generator[158639]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:41 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Nov 24 13:06:41 compute-1 sudo[158612]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:42 compute-1 sudo[158804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyzbonlqpwlpuazljwqqpzsuovhuvpys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989602.082313-2721-168694835205137/AnsiballZ_systemd.py'
Nov 24 13:06:42 compute-1 sudo[158804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:42 compute-1 python3.9[158806]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 24 13:06:42 compute-1 systemd[1]: Reloading.
Nov 24 13:06:42 compute-1 systemd-sysv-generator[158837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:42 compute-1 systemd-rc-local-generator[158833]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:43 compute-1 systemd[1]: Reloading.
Nov 24 13:06:44 compute-1 systemd-rc-local-generator[158872]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:06:44 compute-1 systemd-sysv-generator[158876]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:06:44 compute-1 sudo[158804]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:44 compute-1 sshd-session[104371]: Connection closed by 192.168.122.30 port 40590
Nov 24 13:06:44 compute-1 sshd-session[104368]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:06:44 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Nov 24 13:06:44 compute-1 systemd[1]: session-24.scope: Consumed 3min 18.805s CPU time.
Nov 24 13:06:44 compute-1 systemd-logind[815]: Session 24 logged out. Waiting for processes to exit.
Nov 24 13:06:44 compute-1 systemd-logind[815]: Removed session 24.
Nov 24 13:06:50 compute-1 sshd-session[158904]: Accepted publickey for zuul from 192.168.122.30 port 35734 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:06:50 compute-1 systemd-logind[815]: New session 25 of user zuul.
Nov 24 13:06:50 compute-1 systemd[1]: Started Session 25 of User zuul.
Nov 24 13:06:50 compute-1 sshd-session[158904]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:06:51 compute-1 python3.9[159057]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:06:53 compute-1 python3.9[159211]: ansible-ansible.builtin.service_facts Invoked
Nov 24 13:06:53 compute-1 network[159228]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 13:06:53 compute-1 network[159229]: 'network-scripts' will be removed from distribution in near future.
Nov 24 13:06:53 compute-1 network[159230]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 13:06:58 compute-1 sshd-session[158903]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:06:58 compute-1 sshd-session[158903]: banner exchange: Connection from 218.56.160.82 port 31664: Connection timed out
Nov 24 13:06:58 compute-1 sudo[159499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzsygknmosbwunvmoiuugfjkmtamduxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989618.2096422-75-120544885096459/AnsiballZ_setup.py'
Nov 24 13:06:58 compute-1 sudo[159499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:58 compute-1 python3.9[159501]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 13:06:59 compute-1 sudo[159499]: pam_unix(sudo:session): session closed for user root
Nov 24 13:06:59 compute-1 sudo[159583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xatjausekynuflnvungloanolgiupwex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989618.2096422-75-120544885096459/AnsiballZ_dnf.py'
Nov 24 13:06:59 compute-1 sudo[159583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:06:59 compute-1 python3.9[159585]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 13:07:01 compute-1 sshd-session[159587]: Invalid user sol from 45.148.10.240 port 51432
Nov 24 13:07:02 compute-1 sshd-session[159587]: Connection closed by invalid user sol 45.148.10.240 port 51432 [preauth]
Nov 24 13:07:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:07:04.132 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:07:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:07:04.133 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:07:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:07:04.133 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:07:04 compute-1 sudo[159583]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:05 compute-1 sudo[159741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjqkcxzdzgetinucddqttzuchzoslcwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989625.1671958-99-248148687037084/AnsiballZ_stat.py'
Nov 24 13:07:05 compute-1 sudo[159741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:05 compute-1 python3.9[159743]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:07:05 compute-1 sudo[159741]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:05 compute-1 sshd-session[159589]: Invalid user rstudio from 175.100.24.139 port 58180
Nov 24 13:07:06 compute-1 sshd-session[159589]: Received disconnect from 175.100.24.139 port 58180:11: Bye Bye [preauth]
Nov 24 13:07:06 compute-1 sshd-session[159589]: Disconnected from invalid user rstudio 175.100.24.139 port 58180 [preauth]
Nov 24 13:07:06 compute-1 sudo[159893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skxchfmvgdkeijocejhwcubxwtqghauj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989626.1899962-119-27562451722601/AnsiballZ_command.py'
Nov 24 13:07:06 compute-1 sudo[159893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:06 compute-1 python3.9[159895]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:07:06 compute-1 sudo[159893]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:07 compute-1 sudo[160046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhwepwyrezymfincljnagoycyusihzwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989627.2959108-139-117995655229602/AnsiballZ_stat.py'
Nov 24 13:07:07 compute-1 sudo[160046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:07 compute-1 python3.9[160048]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:07:07 compute-1 sudo[160046]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:08 compute-1 sudo[160209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqmxkzvgcghlwmyqvxrmdqckiqabpsus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989627.9653418-155-111741038689874/AnsiballZ_command.py'
Nov 24 13:07:08 compute-1 sudo[160209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:08 compute-1 podman[160172]: 2025-11-24 13:07:08.328369094 +0000 UTC m=+0.096743748 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:07:08 compute-1 python3.9[160215]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:07:08 compute-1 sudo[160209]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:09 compute-1 sudo[160370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnfqteoueueikxlcrfmujarjdkpraouz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989629.183476-171-29268052563863/AnsiballZ_stat.py'
Nov 24 13:07:09 compute-1 sudo[160370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:09 compute-1 python3.9[160372]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:09 compute-1 sudo[160370]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:10 compute-1 sudo[160493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqgvujgxcethaoibrczavneqtpffpwed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989629.183476-171-29268052563863/AnsiballZ_copy.py'
Nov 24 13:07:10 compute-1 sudo[160493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:10 compute-1 python3.9[160495]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989629.183476-171-29268052563863/.source.iscsi _original_basename=.egki2kar follow=False checksum=4e7dd23e212ca55bcb5a21bbca7de34e703f19f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:10 compute-1 sudo[160493]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:10 compute-1 podman[160509]: 2025-11-24 13:07:10.528896927 +0000 UTC m=+0.080575462 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 13:07:10 compute-1 sshd-session[160598]: Invalid user testuser from 85.209.134.43 port 48516
Nov 24 13:07:10 compute-1 sshd-session[160598]: Received disconnect from 85.209.134.43 port 48516:11: Bye Bye [preauth]
Nov 24 13:07:10 compute-1 sshd-session[160598]: Disconnected from invalid user testuser 85.209.134.43 port 48516 [preauth]
Nov 24 13:07:11 compute-1 sudo[160673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqxvvvaqauwprhqlxpihkmefchczsgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989630.6008525-201-76628649743904/AnsiballZ_file.py'
Nov 24 13:07:11 compute-1 sudo[160673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:11 compute-1 python3.9[160675]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:11 compute-1 sudo[160673]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:11 compute-1 sudo[160825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruqeumdwqypgcnacggsovpxbmdvxvbhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989631.5336702-217-148692841354559/AnsiballZ_lineinfile.py'
Nov 24 13:07:11 compute-1 sudo[160825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:12 compute-1 python3.9[160827]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:12 compute-1 sudo[160825]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:13 compute-1 sudo[160977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejrrikqsvoyqqfivquurhbnvmdukcpqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989632.377348-235-86029474713150/AnsiballZ_systemd_service.py'
Nov 24 13:07:13 compute-1 sudo[160977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:13 compute-1 python3.9[160979]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:07:13 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 24 13:07:13 compute-1 sudo[160977]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:14 compute-1 sudo[161133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odkvneycxhkuyogozjefkcdwykrlimwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989633.7443955-251-222872290220369/AnsiballZ_systemd_service.py'
Nov 24 13:07:14 compute-1 sudo[161133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:14 compute-1 python3.9[161135]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:07:14 compute-1 systemd[1]: Reloading.
Nov 24 13:07:14 compute-1 systemd-sysv-generator[161163]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:07:14 compute-1 systemd-rc-local-generator[161160]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:07:14 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 24 13:07:14 compute-1 systemd[1]: Starting Open-iSCSI...
Nov 24 13:07:14 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Nov 24 13:07:14 compute-1 systemd[1]: Started Open-iSCSI.
Nov 24 13:07:14 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 24 13:07:14 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 24 13:07:14 compute-1 sudo[161133]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:15 compute-1 sshd-session[159591]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:07:15 compute-1 sshd-session[159591]: banner exchange: Connection from 218.56.160.82 port 28975: Connection timed out
Nov 24 13:07:15 compute-1 sudo[161333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feyectdqbipzwgttnnaiofmxsywgejzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989635.1779401-273-199921285282967/AnsiballZ_service_facts.py'
Nov 24 13:07:15 compute-1 sudo[161333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:15 compute-1 python3.9[161335]: ansible-ansible.builtin.service_facts Invoked
Nov 24 13:07:15 compute-1 network[161352]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 13:07:15 compute-1 network[161353]: 'network-scripts' will be removed from distribution in near future.
Nov 24 13:07:15 compute-1 network[161354]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 13:07:21 compute-1 sudo[161333]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:22 compute-1 sudo[161624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoxjftmtjkkykurcthqkhtmlrdxlaeag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989642.3447723-293-7311283855856/AnsiballZ_file.py'
Nov 24 13:07:22 compute-1 sudo[161624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:22 compute-1 python3.9[161626]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 13:07:22 compute-1 sudo[161624]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:23 compute-1 sudo[161776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyvvdaqfsnvbkxwndwpusnkiodaeavno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989643.145771-309-49689042298217/AnsiballZ_modprobe.py'
Nov 24 13:07:23 compute-1 sudo[161776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:23 compute-1 python3.9[161778]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 24 13:07:23 compute-1 sudo[161776]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:24 compute-1 sudo[161934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqcdlaszrkffuvpnosmhjyvrndegymmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989644.0852017-325-119922597793337/AnsiballZ_stat.py'
Nov 24 13:07:24 compute-1 sudo[161934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:24 compute-1 python3.9[161936]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:24 compute-1 sudo[161934]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:24 compute-1 sudo[162057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzityorvumdjneyeenptvelhrfblqtmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989644.0852017-325-119922597793337/AnsiballZ_copy.py'
Nov 24 13:07:24 compute-1 sudo[162057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:25 compute-1 python3.9[162059]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989644.0852017-325-119922597793337/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:25 compute-1 sudo[162057]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:25 compute-1 sshd-session[161807]: Invalid user ftpuser from 68.183.82.237 port 60054
Nov 24 13:07:25 compute-1 sshd-session[161807]: Received disconnect from 68.183.82.237 port 60054:11: Bye Bye [preauth]
Nov 24 13:07:25 compute-1 sshd-session[161807]: Disconnected from invalid user ftpuser 68.183.82.237 port 60054 [preauth]
Nov 24 13:07:25 compute-1 sudo[162209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecpsyjdwoihpthwvuvvxnvkggpkeqlhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989645.5068445-357-270020994479518/AnsiballZ_lineinfile.py'
Nov 24 13:07:25 compute-1 sudo[162209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:25 compute-1 python3.9[162211]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:25 compute-1 sudo[162209]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:26 compute-1 sudo[162361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqpskomruvpxnxcdkirkgduxjyqnmgsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989646.235826-373-3807119568698/AnsiballZ_systemd.py'
Nov 24 13:07:26 compute-1 sudo[162361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:27 compute-1 python3.9[162363]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:07:27 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 13:07:27 compute-1 systemd[1]: Stopped Load Kernel Modules.
Nov 24 13:07:27 compute-1 systemd[1]: Stopping Load Kernel Modules...
Nov 24 13:07:27 compute-1 systemd[1]: Starting Load Kernel Modules...
Nov 24 13:07:27 compute-1 systemd[1]: Finished Load Kernel Modules.
Nov 24 13:07:27 compute-1 sudo[162361]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:27 compute-1 sudo[162517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unufpgulegbxhrldxyywevijphjnbcbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989647.5864348-389-242309926582914/AnsiballZ_file.py'
Nov 24 13:07:27 compute-1 sudo[162517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:28 compute-1 python3.9[162519]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:28 compute-1 sudo[162517]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:28 compute-1 sudo[162669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqdabmpuepcwqvdhuuxidpktukeppgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989648.353417-407-277494172917620/AnsiballZ_stat.py'
Nov 24 13:07:28 compute-1 sudo[162669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:28 compute-1 python3.9[162671]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:07:28 compute-1 sudo[162669]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:29 compute-1 sudo[162821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbhyeogqcwuwavvxklomyxxeptztfcpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989649.1269732-425-222455990809391/AnsiballZ_stat.py'
Nov 24 13:07:29 compute-1 sudo[162821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:29 compute-1 python3.9[162823]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:07:29 compute-1 sudo[162821]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:30 compute-1 sudo[162973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijgazrgwfizwnhlcphhwuettrecknhoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989649.8443894-441-275659209169078/AnsiballZ_stat.py'
Nov 24 13:07:30 compute-1 sudo[162973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:30 compute-1 python3.9[162975]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:30 compute-1 sudo[162973]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:30 compute-1 sudo[163096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znnpwrrvynexpipdzwchbctvxacpzqmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989649.8443894-441-275659209169078/AnsiballZ_copy.py'
Nov 24 13:07:30 compute-1 sudo[163096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:31 compute-1 python3.9[163098]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989649.8443894-441-275659209169078/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:31 compute-1 sudo[163096]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:31 compute-1 sudo[163248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjetxelhmhpgchrsydzfdwpmuzggjxsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989651.3156223-472-196197214973686/AnsiballZ_command.py'
Nov 24 13:07:31 compute-1 sudo[163248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:31 compute-1 python3.9[163250]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:07:31 compute-1 sshd-session[161462]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:07:31 compute-1 sshd-session[161462]: banner exchange: Connection from 218.56.160.82 port 29315: Connection timed out
Nov 24 13:07:32 compute-1 sudo[163248]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:32 compute-1 sudo[163401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilbvdxiufekjleupprylibbhcnekobeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989652.2589283-487-82517395744767/AnsiballZ_lineinfile.py'
Nov 24 13:07:32 compute-1 sudo[163401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:32 compute-1 python3.9[163403]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:32 compute-1 sudo[163401]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:33 compute-1 sudo[163553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvphelnhkzknggnelbsagwwzpdlwdxxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989653.066975-503-90943032522340/AnsiballZ_replace.py'
Nov 24 13:07:33 compute-1 sudo[163553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:33 compute-1 python3.9[163555]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:33 compute-1 sudo[163553]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:34 compute-1 sudo[163705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmmieaeaiddfspohdkipzlkoufdibelc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989653.9678047-519-187688538245286/AnsiballZ_replace.py'
Nov 24 13:07:34 compute-1 sudo[163705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:34 compute-1 python3.9[163707]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:34 compute-1 sudo[163705]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:35 compute-1 sudo[163857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvekpfepqyyelhjgejnjlxtmjlcmhotx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989654.76391-537-5389662247676/AnsiballZ_lineinfile.py'
Nov 24 13:07:35 compute-1 sudo[163857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:35 compute-1 python3.9[163859]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:35 compute-1 sudo[163857]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:35 compute-1 sudo[164011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgoyvoabttpziwiaaahdeiillnokuvhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989655.4908535-537-46608446853449/AnsiballZ_lineinfile.py'
Nov 24 13:07:35 compute-1 sudo[164011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:35 compute-1 python3.9[164013]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:36 compute-1 sudo[164011]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:36 compute-1 sshd-session[163928]: Invalid user in from 176.114.89.34 port 34290
Nov 24 13:07:36 compute-1 sudo[164163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orrufjmuejkxbgtqpaujqgtodkcsltph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989656.1232615-537-198450652416867/AnsiballZ_lineinfile.py'
Nov 24 13:07:36 compute-1 sudo[164163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:36 compute-1 sshd-session[163928]: Received disconnect from 176.114.89.34 port 34290:11: Bye Bye [preauth]
Nov 24 13:07:36 compute-1 sshd-session[163928]: Disconnected from invalid user in 176.114.89.34 port 34290 [preauth]
Nov 24 13:07:36 compute-1 python3.9[164165]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:36 compute-1 sudo[164163]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:36 compute-1 sudo[164315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrulupqzdvsytaxkpuyayltlypvhgnhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989656.6908257-537-263003583345888/AnsiballZ_lineinfile.py'
Nov 24 13:07:36 compute-1 sudo[164315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:37 compute-1 python3.9[164317]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:37 compute-1 sudo[164315]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:37 compute-1 sudo[164467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myinlrqxytvvmsrurgziqotrkxpzqxjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989657.6647327-595-119142472943600/AnsiballZ_stat.py'
Nov 24 13:07:37 compute-1 sudo[164467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:38 compute-1 python3.9[164469]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:07:38 compute-1 sudo[164467]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:38 compute-1 podman[164557]: 2025-11-24 13:07:38.505610704 +0000 UTC m=+0.052641854 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 24 13:07:38 compute-1 sudo[164641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxfxqagqifhasiddbqskxpoubpokfrhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989658.3397841-611-221266352172568/AnsiballZ_file.py'
Nov 24 13:07:38 compute-1 sudo[164641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:38 compute-1 python3.9[164643]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:38 compute-1 sudo[164641]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:39 compute-1 sudo[164793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efeiatbdctohkbldgbpaikxrhznvrdei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989659.140299-629-279195284484682/AnsiballZ_file.py'
Nov 24 13:07:39 compute-1 sudo[164793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:39 compute-1 python3.9[164795]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:39 compute-1 sudo[164793]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:40 compute-1 sudo[164945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chvlyfiyydcdtlilemannustzuvwfuwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989659.9263859-645-24330497316337/AnsiballZ_stat.py'
Nov 24 13:07:40 compute-1 sudo[164945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:40 compute-1 python3.9[164947]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:40 compute-1 sudo[164945]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:40 compute-1 sudo[165040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ychxnuxeablygoezepvpuziuzcciraoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989659.9263859-645-24330497316337/AnsiballZ_file.py'
Nov 24 13:07:40 compute-1 sudo[165040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:40 compute-1 podman[164997]: 2025-11-24 13:07:40.752697127 +0000 UTC m=+0.080694128 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:07:40 compute-1 python3.9[165047]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:41 compute-1 sudo[165040]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:41 compute-1 sudo[165201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dngiwjeyiaztlzwfndbgoqdyjtxpffex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989661.1346993-645-99957778336753/AnsiballZ_stat.py'
Nov 24 13:07:41 compute-1 sudo[165201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:41 compute-1 python3.9[165203]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:41 compute-1 sudo[165201]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:41 compute-1 sudo[165279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssqckvapducejlqyaarnaiadehgbmwnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989661.1346993-645-99957778336753/AnsiballZ_file.py'
Nov 24 13:07:41 compute-1 sudo[165279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:42 compute-1 python3.9[165281]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:42 compute-1 sudo[165279]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:42 compute-1 sudo[165431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchhfxuqlbzsgsfuuqtpgxtujqocmoav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989662.2501056-691-210816383450808/AnsiballZ_file.py'
Nov 24 13:07:42 compute-1 sudo[165431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:42 compute-1 python3.9[165433]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:42 compute-1 sudo[165431]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:43 compute-1 sudo[165583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtehkpawropsiesxrtopthplvipbsbiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989663.0470293-707-93965352216612/AnsiballZ_stat.py'
Nov 24 13:07:43 compute-1 sudo[165583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:43 compute-1 python3.9[165585]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:43 compute-1 sudo[165583]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:43 compute-1 sudo[165661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-padvdlosjjkvqjoobreomtdhbpuonuxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989663.0470293-707-93965352216612/AnsiballZ_file.py'
Nov 24 13:07:43 compute-1 sudo[165661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:43 compute-1 python3.9[165663]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:43 compute-1 sudo[165661]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:44 compute-1 sudo[165814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsaeswutphpxlqjfuvtavevoomyygqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989664.3210785-731-94483750762723/AnsiballZ_stat.py'
Nov 24 13:07:44 compute-1 sudo[165814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:44 compute-1 python3.9[165816]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:44 compute-1 sudo[165814]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:45 compute-1 sudo[165894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guuxvwtpzupluebsrhvslyumxvznumbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989664.3210785-731-94483750762723/AnsiballZ_file.py'
Nov 24 13:07:45 compute-1 sudo[165894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:45 compute-1 python3.9[165896]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:45 compute-1 sudo[165894]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:45 compute-1 sshd-session[165817]: Invalid user sol from 193.32.162.145 port 58456
Nov 24 13:07:45 compute-1 sshd-session[165817]: Connection closed by invalid user sol 193.32.162.145 port 58456 [preauth]
Nov 24 13:07:45 compute-1 sudo[166046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaurcnsmpelbrmhnrgdfjbjipbebiqte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989665.625237-755-152885912428352/AnsiballZ_systemd.py'
Nov 24 13:07:45 compute-1 sudo[166046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:46 compute-1 python3.9[166048]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:07:46 compute-1 systemd[1]: Reloading.
Nov 24 13:07:46 compute-1 systemd-rc-local-generator[166076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:07:46 compute-1 systemd-sysv-generator[166079]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:07:46 compute-1 sudo[166046]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:47 compute-1 sudo[166236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jujcovydbyxxweokehfppwzbngdngiul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989666.869982-771-204627411951507/AnsiballZ_stat.py'
Nov 24 13:07:47 compute-1 sudo[166236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:47 compute-1 python3.9[166238]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:47 compute-1 sudo[166236]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:47 compute-1 sudo[166314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldpadebvdblgklymajbmwvdytochekii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989666.869982-771-204627411951507/AnsiballZ_file.py'
Nov 24 13:07:47 compute-1 sudo[166314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:47 compute-1 python3.9[166316]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:47 compute-1 sudo[166314]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:48 compute-1 sudo[166466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxtqfawveumcmzajhavpaagwqgxwwuvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989668.0664668-795-172416615942115/AnsiballZ_stat.py'
Nov 24 13:07:48 compute-1 sudo[166466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:48 compute-1 python3.9[166468]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:48 compute-1 sudo[166466]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:48 compute-1 sudo[166544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocwdvjzrbuqrclrsiuydyllrghavrofw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989668.0664668-795-172416615942115/AnsiballZ_file.py'
Nov 24 13:07:48 compute-1 sudo[166544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:48 compute-1 python3.9[166546]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:48 compute-1 sudo[166544]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:49 compute-1 sudo[166696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwgzeknzezmdwxgfnoyfnabefdjhxzbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989669.6144533-819-222804053481380/AnsiballZ_systemd.py'
Nov 24 13:07:49 compute-1 sudo[166696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:50 compute-1 python3.9[166698]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:07:50 compute-1 systemd[1]: Reloading.
Nov 24 13:07:50 compute-1 systemd-rc-local-generator[166727]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:07:50 compute-1 systemd-sysv-generator[166730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:07:50 compute-1 systemd[1]: Starting Create netns directory...
Nov 24 13:07:50 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 13:07:50 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 13:07:50 compute-1 systemd[1]: Finished Create netns directory.
Nov 24 13:07:50 compute-1 sudo[166696]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:51 compute-1 sudo[166891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqzziqndrprylikngpcvowpkjlbmgzde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989671.1063468-839-88714299071685/AnsiballZ_file.py'
Nov 24 13:07:51 compute-1 sudo[166891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:51 compute-1 python3.9[166893]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:51 compute-1 sudo[166891]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:52 compute-1 sudo[167043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wapvvohqumctcxxevhpqloxrutbcpcbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989671.8585706-855-239312180294698/AnsiballZ_stat.py'
Nov 24 13:07:52 compute-1 sudo[167043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:52 compute-1 python3.9[167045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:52 compute-1 sudo[167043]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:52 compute-1 sudo[167166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohnrclhhythejplnucrwbmrebvdoixzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989671.8585706-855-239312180294698/AnsiballZ_copy.py'
Nov 24 13:07:52 compute-1 sudo[167166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:52 compute-1 python3.9[167168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989671.8585706-855-239312180294698/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:52 compute-1 sudo[167166]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:53 compute-1 sudo[167318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvpscgdxfbmiqvskuavrzarjlzlydpgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989673.5442803-889-145936120837454/AnsiballZ_file.py'
Nov 24 13:07:53 compute-1 sudo[167318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:54 compute-1 python3.9[167320]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:07:54 compute-1 sudo[167318]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:54 compute-1 sshd-session[165664]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:07:54 compute-1 sshd-session[165664]: banner exchange: Connection from 218.56.160.82 port 31242: Connection timed out
Nov 24 13:07:54 compute-1 sudo[167470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fasftzmvonqqzngcezfvvocoponxmzdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989674.236967-905-115315822395235/AnsiballZ_stat.py'
Nov 24 13:07:54 compute-1 sudo[167470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:54 compute-1 python3.9[167472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:07:54 compute-1 sudo[167470]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:54 compute-1 sudo[167593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admfufbesybarbrthkvimfyjiizcaqlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989674.236967-905-115315822395235/AnsiballZ_copy.py'
Nov 24 13:07:54 compute-1 sudo[167593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:55 compute-1 python3.9[167595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989674.236967-905-115315822395235/.source.json _original_basename=.b99k2l8a follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:55 compute-1 sudo[167593]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:56 compute-1 sudo[167745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqvrhxbzikppvoyczakkamnkceimtuyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989675.7266634-935-163450196001355/AnsiballZ_file.py'
Nov 24 13:07:56 compute-1 sudo[167745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:56 compute-1 python3.9[167747]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:07:56 compute-1 sudo[167745]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:56 compute-1 sudo[167897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcqudyibcnvzctezcyympsccdubstakc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989676.6682181-951-79018744890751/AnsiballZ_stat.py'
Nov 24 13:07:56 compute-1 sudo[167897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:57 compute-1 sudo[167897]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:57 compute-1 sudo[168020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qobqzxcoizrxbudrpnaluipgvvhndmur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989676.6682181-951-79018744890751/AnsiballZ_copy.py'
Nov 24 13:07:57 compute-1 sudo[168020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:57 compute-1 sudo[168020]: pam_unix(sudo:session): session closed for user root
Nov 24 13:07:58 compute-1 sudo[168172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsqyoexhmzdssnuvcxbulgtputcrbujx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989678.2759366-985-211357627456624/AnsiballZ_container_config_data.py'
Nov 24 13:07:58 compute-1 sudo[168172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:07:59 compute-1 python3.9[168174]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 24 13:07:59 compute-1 sudo[168172]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:00 compute-1 sudo[168324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xynhfhkswhvnjtodojxbzpgsyptqsztk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989679.6322517-1003-169215249872519/AnsiballZ_container_config_hash.py'
Nov 24 13:08:00 compute-1 sudo[168324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:00 compute-1 python3.9[168326]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:08:00 compute-1 sudo[168324]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:01 compute-1 sudo[168476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbyxwkganlkedjzchboebiocmkgyaqft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989680.643202-1021-87115120645259/AnsiballZ_podman_container_info.py'
Nov 24 13:08:01 compute-1 sudo[168476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:01 compute-1 python3.9[168478]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 13:08:01 compute-1 sudo[168476]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:02 compute-1 sudo[168655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzhoxmgqamakpxizsgluwmvijwivcwoa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989682.2457633-1047-188039520405556/AnsiballZ_edpm_container_manage.py'
Nov 24 13:08:02 compute-1 sudo[168655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:03 compute-1 python3[168657]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:08:03 compute-1 podman[168694]: 2025-11-24 13:08:03.418367309 +0000 UTC m=+0.022691437 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 24 13:08:03 compute-1 podman[168694]: 2025-11-24 13:08:03.909020149 +0000 UTC m=+0.513344247 container create 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 24 13:08:03 compute-1 python3[168657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 24 13:08:04 compute-1 sudo[168655]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:08:04.133 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:08:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:08:04.134 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:08:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:08:04.134 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:08:04 compute-1 sudo[168881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xucbmttzedhghnkexwakwicyaadstbrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989684.257697-1063-5370530252105/AnsiballZ_stat.py'
Nov 24 13:08:04 compute-1 sudo[168881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:04 compute-1 python3.9[168883]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:08:04 compute-1 sudo[168881]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:05 compute-1 sudo[169035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnedhdjsgpkrdyeyvsfgvhizpamogzky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989685.1334047-1081-232413402381020/AnsiballZ_file.py'
Nov 24 13:08:05 compute-1 sudo[169035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:05 compute-1 python3.9[169037]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:05 compute-1 sudo[169035]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:05 compute-1 sudo[169111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bndqvzkawncvrrtgyxaksfpzjvddhrag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989685.1334047-1081-232413402381020/AnsiballZ_stat.py'
Nov 24 13:08:05 compute-1 sudo[169111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:06 compute-1 python3.9[169113]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:08:06 compute-1 sudo[169111]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:06 compute-1 sudo[169262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fujgunccbegzbhxxwcpouacnoctltfjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989686.11126-1081-131563030365282/AnsiballZ_copy.py'
Nov 24 13:08:06 compute-1 sudo[169262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:06 compute-1 python3.9[169264]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763989686.11126-1081-131563030365282/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:06 compute-1 sudo[169262]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:07 compute-1 sudo[169338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gncxymnsvbzlvypkejufmtpgnhkbutpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989686.11126-1081-131563030365282/AnsiballZ_systemd.py'
Nov 24 13:08:07 compute-1 sudo[169338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:07 compute-1 python3.9[169340]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:08:07 compute-1 systemd[1]: Reloading.
Nov 24 13:08:07 compute-1 systemd-rc-local-generator[169367]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:07 compute-1 systemd-sysv-generator[169370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:07 compute-1 sudo[169338]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:07 compute-1 sudo[169449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enmbetudauwegrkizqosmwtlxkrgcnxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989686.11126-1081-131563030365282/AnsiballZ_systemd.py'
Nov 24 13:08:07 compute-1 sudo[169449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:08 compute-1 python3.9[169451]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:08 compute-1 systemd[1]: Reloading.
Nov 24 13:08:08 compute-1 systemd-rc-local-generator[169483]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:08 compute-1 systemd-sysv-generator[169486]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:08 compute-1 systemd[1]: Starting multipathd container...
Nov 24 13:08:08 compute-1 podman[169492]: 2025-11-24 13:08:08.754333266 +0000 UTC m=+0.083007712 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 13:08:08 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:08:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b27529846b0af268650ee83ad48b6435bcfd956b50748de5575ba1ef98e464/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 13:08:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b27529846b0af268650ee83ad48b6435bcfd956b50748de5575ba1ef98e464/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 13:08:08 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.
Nov 24 13:08:08 compute-1 podman[169494]: 2025-11-24 13:08:08.864904607 +0000 UTC m=+0.194314343 container init 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Nov 24 13:08:08 compute-1 multipathd[169523]: + sudo -E kolla_set_configs
Nov 24 13:08:08 compute-1 sudo[169533]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 13:08:08 compute-1 podman[169494]: 2025-11-24 13:08:08.897517647 +0000 UTC m=+0.226927333 container start 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 24 13:08:08 compute-1 sudo[169533]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 13:08:08 compute-1 sudo[169533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 13:08:08 compute-1 podman[169494]: multipathd
Nov 24 13:08:08 compute-1 systemd[1]: Started multipathd container.
Nov 24 13:08:08 compute-1 sudo[169449]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:08 compute-1 multipathd[169523]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 13:08:08 compute-1 multipathd[169523]: INFO:__main__:Validating config file
Nov 24 13:08:08 compute-1 multipathd[169523]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 13:08:08 compute-1 multipathd[169523]: INFO:__main__:Writing out command to execute
Nov 24 13:08:08 compute-1 sudo[169533]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:08 compute-1 multipathd[169523]: ++ cat /run_command
Nov 24 13:08:08 compute-1 multipathd[169523]: + CMD='/usr/sbin/multipathd -d'
Nov 24 13:08:08 compute-1 multipathd[169523]: + ARGS=
Nov 24 13:08:08 compute-1 multipathd[169523]: + sudo kolla_copy_cacerts
Nov 24 13:08:08 compute-1 podman[169534]: 2025-11-24 13:08:08.96610971 +0000 UTC m=+0.058905807 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:08:08 compute-1 systemd[1]: 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151-29de8bc8646fcdb1.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 13:08:08 compute-1 systemd[1]: 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151-29de8bc8646fcdb1.service: Failed with result 'exit-code'.
Nov 24 13:08:08 compute-1 sudo[169557]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 13:08:08 compute-1 sudo[169557]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 13:08:08 compute-1 sudo[169557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 13:08:08 compute-1 sudo[169557]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:08 compute-1 multipathd[169523]: + [[ ! -n '' ]]
Nov 24 13:08:08 compute-1 multipathd[169523]: + . kolla_extend_start
Nov 24 13:08:08 compute-1 multipathd[169523]: Running command: '/usr/sbin/multipathd -d'
Nov 24 13:08:08 compute-1 multipathd[169523]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 24 13:08:08 compute-1 multipathd[169523]: + umask 0022
Nov 24 13:08:08 compute-1 multipathd[169523]: + exec /usr/sbin/multipathd -d
Nov 24 13:08:09 compute-1 multipathd[169523]: 2707.716790 | --------start up--------
Nov 24 13:08:09 compute-1 multipathd[169523]: 2707.716810 | read /etc/multipath.conf
Nov 24 13:08:09 compute-1 multipathd[169523]: 2707.724880 | path checkers start up
Nov 24 13:08:09 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 24 13:08:09 compute-1 python3.9[169717]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:08:10 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 13:08:10 compute-1 sudo[169870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brpwktzppfkcjtbebalzgomfnwpyznsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989690.0432348-1154-58147311551431/AnsiballZ_command.py'
Nov 24 13:08:10 compute-1 sudo[169870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:10 compute-1 python3.9[169872]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:10 compute-1 sudo[169870]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:10 compute-1 sshd-session[169892]: Invalid user proxyuser from 85.209.134.43 port 37216
Nov 24 13:08:10 compute-1 sshd-session[169892]: Received disconnect from 85.209.134.43 port 37216:11: Bye Bye [preauth]
Nov 24 13:08:10 compute-1 sshd-session[169892]: Disconnected from invalid user proxyuser 85.209.134.43 port 37216 [preauth]
Nov 24 13:08:10 compute-1 podman[169912]: 2025-11-24 13:08:10.893172551 +0000 UTC m=+0.097554413 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:08:11 compute-1 sudo[170063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuhxsoxkktmxvkrwgneugrnmtshwbhca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989690.931882-1169-190698343468014/AnsiballZ_systemd.py'
Nov 24 13:08:11 compute-1 sudo[170063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:11 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 24 13:08:11 compute-1 python3.9[170065]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:08:11 compute-1 systemd[1]: Stopping multipathd container...
Nov 24 13:08:11 compute-1 multipathd[169523]: 2710.407465 | exit (signal)
Nov 24 13:08:11 compute-1 multipathd[169523]: 2710.407576 | --------shut down-------
Nov 24 13:08:11 compute-1 systemd[1]: libpod-32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.scope: Deactivated successfully.
Nov 24 13:08:11 compute-1 podman[170070]: 2025-11-24 13:08:11.725226203 +0000 UTC m=+0.093833180 container died 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:08:11 compute-1 systemd[1]: 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151-29de8bc8646fcdb1.timer: Deactivated successfully.
Nov 24 13:08:11 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.
Nov 24 13:08:11 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151-userdata-shm.mount: Deactivated successfully.
Nov 24 13:08:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-59b27529846b0af268650ee83ad48b6435bcfd956b50748de5575ba1ef98e464-merged.mount: Deactivated successfully.
Nov 24 13:08:11 compute-1 podman[170070]: 2025-11-24 13:08:11.781151457 +0000 UTC m=+0.149758454 container cleanup 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:08:11 compute-1 podman[170070]: multipathd
Nov 24 13:08:11 compute-1 podman[170096]: multipathd
Nov 24 13:08:11 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 24 13:08:11 compute-1 systemd[1]: Stopped multipathd container.
Nov 24 13:08:11 compute-1 systemd[1]: Starting multipathd container...
Nov 24 13:08:11 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:08:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b27529846b0af268650ee83ad48b6435bcfd956b50748de5575ba1ef98e464/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 13:08:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b27529846b0af268650ee83ad48b6435bcfd956b50748de5575ba1ef98e464/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 13:08:11 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.
Nov 24 13:08:11 compute-1 podman[170110]: 2025-11-24 13:08:11.983903622 +0000 UTC m=+0.107367714 container init 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 13:08:11 compute-1 multipathd[170124]: + sudo -E kolla_set_configs
Nov 24 13:08:12 compute-1 podman[170110]: 2025-11-24 13:08:12.011251476 +0000 UTC m=+0.134715518 container start 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 13:08:12 compute-1 sudo[170130]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 13:08:12 compute-1 sudo[170130]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 13:08:12 compute-1 sudo[170130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 13:08:12 compute-1 podman[170110]: multipathd
Nov 24 13:08:12 compute-1 systemd[1]: Started multipathd container.
Nov 24 13:08:12 compute-1 multipathd[170124]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 13:08:12 compute-1 multipathd[170124]: INFO:__main__:Validating config file
Nov 24 13:08:12 compute-1 multipathd[170124]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 13:08:12 compute-1 multipathd[170124]: INFO:__main__:Writing out command to execute
Nov 24 13:08:12 compute-1 sudo[170063]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:12 compute-1 sudo[170130]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:12 compute-1 multipathd[170124]: ++ cat /run_command
Nov 24 13:08:12 compute-1 multipathd[170124]: + CMD='/usr/sbin/multipathd -d'
Nov 24 13:08:12 compute-1 multipathd[170124]: + ARGS=
Nov 24 13:08:12 compute-1 multipathd[170124]: + sudo kolla_copy_cacerts
Nov 24 13:08:12 compute-1 sudo[170148]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 13:08:12 compute-1 sudo[170148]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 13:08:12 compute-1 sudo[170148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 13:08:12 compute-1 sudo[170148]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:12 compute-1 multipathd[170124]: + [[ ! -n '' ]]
Nov 24 13:08:12 compute-1 multipathd[170124]: + . kolla_extend_start
Nov 24 13:08:12 compute-1 multipathd[170124]: Running command: '/usr/sbin/multipathd -d'
Nov 24 13:08:12 compute-1 multipathd[170124]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 24 13:08:12 compute-1 multipathd[170124]: + umask 0022
Nov 24 13:08:12 compute-1 multipathd[170124]: + exec /usr/sbin/multipathd -d
Nov 24 13:08:12 compute-1 podman[170131]: 2025-11-24 13:08:12.104199701 +0000 UTC m=+0.081127710 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 24 13:08:12 compute-1 multipathd[170124]: 2710.820227 | --------start up--------
Nov 24 13:08:12 compute-1 multipathd[170124]: 2710.820246 | read /etc/multipath.conf
Nov 24 13:08:12 compute-1 multipathd[170124]: 2710.826413 | path checkers start up
Nov 24 13:08:12 compute-1 systemd[1]: 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151-2d4188d2536382f2.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 13:08:12 compute-1 systemd[1]: 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151-2d4188d2536382f2.service: Failed with result 'exit-code'.
Nov 24 13:08:12 compute-1 sshd-session[168529]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:08:12 compute-1 sshd-session[168529]: banner exchange: Connection from 218.56.160.82 port 32519: Connection timed out
Nov 24 13:08:12 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 24 13:08:12 compute-1 sudo[170315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmcygwmsodtbwfhafmmjmydqbsxlfiuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989692.4409187-1185-143571750260558/AnsiballZ_file.py'
Nov 24 13:08:12 compute-1 sudo[170315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:12 compute-1 python3.9[170317]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:12 compute-1 sudo[170315]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:13 compute-1 sudo[170467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btoakrdfihyahwhjcbdeirizbadyllyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989693.5130837-1209-92148491997072/AnsiballZ_file.py'
Nov 24 13:08:13 compute-1 sudo[170467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:13 compute-1 python3.9[170469]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 13:08:14 compute-1 sudo[170467]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:14 compute-1 sudo[170619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgbkqpicnyhdicvkrfvcuwtsxbktgpcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989694.2202494-1225-38771613922476/AnsiballZ_modprobe.py'
Nov 24 13:08:14 compute-1 sudo[170619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:14 compute-1 python3.9[170621]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 24 13:08:14 compute-1 kernel: Key type psk registered
Nov 24 13:08:14 compute-1 sudo[170619]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:15 compute-1 sudo[170781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqgfeilfpqyqobbqjvlqrtygzcjwouvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989695.0526922-1241-16192922241533/AnsiballZ_stat.py'
Nov 24 13:08:15 compute-1 sudo[170781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:15 compute-1 python3.9[170783]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:08:15 compute-1 sudo[170781]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:15 compute-1 sudo[170906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrwmjgqhlilciqkubruefmazaghnxqcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989695.0526922-1241-16192922241533/AnsiballZ_copy.py'
Nov 24 13:08:15 compute-1 sudo[170906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:16 compute-1 python3.9[170908]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989695.0526922-1241-16192922241533/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:16 compute-1 sudo[170906]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:16 compute-1 sshd-session[170791]: Received disconnect from 5.198.176.28 port 42548:11: Bye Bye [preauth]
Nov 24 13:08:16 compute-1 sshd-session[170791]: Disconnected from authenticating user root 5.198.176.28 port 42548 [preauth]
Nov 24 13:08:17 compute-1 sudo[171058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nawnizzepeutliidraljmgnrzajmtoyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989696.725689-1273-274387479355697/AnsiballZ_lineinfile.py'
Nov 24 13:08:17 compute-1 sudo[171058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:17 compute-1 python3.9[171060]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:17 compute-1 sudo[171058]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:17 compute-1 sudo[171210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkqfhiavxkiwjibvgznrcgqxtcakmkql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989697.5609107-1289-154525797405012/AnsiballZ_systemd.py'
Nov 24 13:08:17 compute-1 sudo[171210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:18 compute-1 python3.9[171212]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:08:18 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 13:08:18 compute-1 systemd[1]: Stopped Load Kernel Modules.
Nov 24 13:08:18 compute-1 systemd[1]: Stopping Load Kernel Modules...
Nov 24 13:08:18 compute-1 systemd[1]: Starting Load Kernel Modules...
Nov 24 13:08:18 compute-1 systemd[1]: Finished Load Kernel Modules.
Nov 24 13:08:18 compute-1 sudo[171210]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:18 compute-1 sudo[171367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-impnnyxkwppxfqbwpoannzwrmotwhegn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989698.6091337-1305-170700740928559/AnsiballZ_dnf.py'
Nov 24 13:08:18 compute-1 sudo[171367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:19 compute-1 python3.9[171369]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 13:08:21 compute-1 systemd[1]: Reloading.
Nov 24 13:08:21 compute-1 systemd-sysv-generator[171405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:21 compute-1 systemd-rc-local-generator[171402]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:21 compute-1 systemd[1]: Reloading.
Nov 24 13:08:21 compute-1 systemd-sysv-generator[171439]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:21 compute-1 systemd-rc-local-generator[171436]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:22 compute-1 systemd-logind[815]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 24 13:08:22 compute-1 systemd-logind[815]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 24 13:08:22 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 13:08:22 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 24 13:08:22 compute-1 systemd[1]: Reloading.
Nov 24 13:08:22 compute-1 systemd-rc-local-generator[171531]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:22 compute-1 systemd-sysv-generator[171534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:22 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 13:08:23 compute-1 sudo[171367]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:23 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 13:08:23 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 24 13:08:23 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.554s CPU time.
Nov 24 13:08:23 compute-1 systemd[1]: run-r06180949e9284aa28cd698130143e65c.service: Deactivated successfully.
Nov 24 13:08:23 compute-1 sudo[172819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-novtaqcjpdgtbxtuskwrlaqbazetisrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989703.4310923-1321-276793274653750/AnsiballZ_systemd_service.py'
Nov 24 13:08:23 compute-1 sudo[172819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:24 compute-1 python3.9[172821]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:08:24 compute-1 systemd[1]: Stopping Open-iSCSI...
Nov 24 13:08:24 compute-1 iscsid[161175]: iscsid shutting down.
Nov 24 13:08:24 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Nov 24 13:08:24 compute-1 systemd[1]: Stopped Open-iSCSI.
Nov 24 13:08:24 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 24 13:08:24 compute-1 systemd[1]: Starting Open-iSCSI...
Nov 24 13:08:24 compute-1 systemd[1]: Started Open-iSCSI.
Nov 24 13:08:24 compute-1 sudo[172819]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:25 compute-1 python3.9[172975]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:08:26 compute-1 sudo[173129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgudtqeaywrdqjjtaihfgfdpdsmaqdsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989705.787954-1357-163598310031019/AnsiballZ_file.py'
Nov 24 13:08:26 compute-1 sudo[173129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:26 compute-1 python3.9[173131]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:26 compute-1 sudo[173129]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:27 compute-1 sudo[173281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xblbsfqoodqsyhegubvqhfvquqgohjyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989706.7985108-1378-87159628421603/AnsiballZ_systemd_service.py'
Nov 24 13:08:27 compute-1 sudo[173281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:27 compute-1 python3.9[173283]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:08:27 compute-1 systemd[1]: Reloading.
Nov 24 13:08:27 compute-1 systemd-rc-local-generator[173309]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:27 compute-1 systemd-sysv-generator[173313]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:27 compute-1 sudo[173281]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:28 compute-1 python3.9[173467]: ansible-ansible.builtin.service_facts Invoked
Nov 24 13:08:28 compute-1 network[173484]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 13:08:28 compute-1 network[173485]: 'network-scripts' will be removed from distribution in near future.
Nov 24 13:08:28 compute-1 network[173486]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 13:08:28 compute-1 sshd-session[171241]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:08:28 compute-1 sshd-session[171241]: banner exchange: Connection from 218.56.160.82 port 33447: Connection timed out
Nov 24 13:08:34 compute-1 sshd-session[173599]: Invalid user support from 175.100.24.139 port 60370
Nov 24 13:08:34 compute-1 sshd-session[173599]: Received disconnect from 175.100.24.139 port 60370:11: Bye Bye [preauth]
Nov 24 13:08:34 compute-1 sshd-session[173599]: Disconnected from invalid user support 175.100.24.139 port 60370 [preauth]
Nov 24 13:08:34 compute-1 sudo[173763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjwcdcqsxafodeldyqoxhjyxulrfretk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989714.603982-1416-245586373843059/AnsiballZ_systemd_service.py'
Nov 24 13:08:34 compute-1 sudo[173763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:35 compute-1 python3.9[173765]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:35 compute-1 sudo[173763]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:35 compute-1 sudo[173916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgozhnkwkdmutolkzrlxmrhyqlvgqdwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989715.5075123-1416-273296506214100/AnsiballZ_systemd_service.py'
Nov 24 13:08:35 compute-1 sudo[173916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:36 compute-1 python3.9[173918]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:36 compute-1 sudo[173916]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:36 compute-1 sudo[174069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpnrbvqtackoygpyevsmzziochyompto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989716.3307219-1416-120839279987052/AnsiballZ_systemd_service.py'
Nov 24 13:08:36 compute-1 sudo[174069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:36 compute-1 python3.9[174071]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:37 compute-1 sudo[174069]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:37 compute-1 sudo[174222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjezctjfykqwapggbqqffikctrzlamas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989717.1743784-1416-194287686421038/AnsiballZ_systemd_service.py'
Nov 24 13:08:37 compute-1 sudo[174222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:37 compute-1 python3.9[174224]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:37 compute-1 sudo[174222]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:37 compute-1 sshd[128576]: Timeout before authentication for connection from 120.48.130.213 to 38.102.83.173, pid = 157130
Nov 24 13:08:38 compute-1 sudo[174377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viaqzwwmjgiykpxduzosvbjtpdyewgxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989717.9131415-1416-280299864092610/AnsiballZ_systemd_service.py'
Nov 24 13:08:38 compute-1 sudo[174377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:38 compute-1 python3.9[174379]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:38 compute-1 sudo[174377]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:38 compute-1 sudo[174541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajobccpbfagetnxnsjlnxobplytgxgma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989718.6686153-1416-218584214553554/AnsiballZ_systemd_service.py'
Nov 24 13:08:38 compute-1 sudo[174541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:38 compute-1 podman[174504]: 2025-11-24 13:08:38.998134239 +0000 UTC m=+0.059733252 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:08:39 compute-1 sshd-session[174226]: Invalid user aaa from 68.183.82.237 port 35132
Nov 24 13:08:39 compute-1 sshd-session[174226]: Received disconnect from 68.183.82.237 port 35132:11: Bye Bye [preauth]
Nov 24 13:08:39 compute-1 sshd-session[174226]: Disconnected from invalid user aaa 68.183.82.237 port 35132 [preauth]
Nov 24 13:08:39 compute-1 python3.9[174549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:39 compute-1 sudo[174541]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:39 compute-1 sudo[174702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmclehpjqfiftqrzhrdvmcgpktplmzjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989719.428392-1416-232603625772925/AnsiballZ_systemd_service.py'
Nov 24 13:08:39 compute-1 sudo[174702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:39 compute-1 python3.9[174704]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:40 compute-1 sudo[174702]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:40 compute-1 sshd-session[173636]: Invalid user syncthing from 45.78.194.40 port 36962
Nov 24 13:08:40 compute-1 sudo[174855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoogvujjcvsxjydaaazahmgdshbmccbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989720.1184983-1416-167415907469514/AnsiballZ_systemd_service.py'
Nov 24 13:08:40 compute-1 sudo[174855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:40 compute-1 python3.9[174857]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:08:40 compute-1 sudo[174855]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:40 compute-1 sshd-session[173636]: Received disconnect from 45.78.194.40 port 36962:11: Bye Bye [preauth]
Nov 24 13:08:40 compute-1 sshd-session[173636]: Disconnected from invalid user syncthing 45.78.194.40 port 36962 [preauth]
Nov 24 13:08:41 compute-1 podman[174936]: 2025-11-24 13:08:41.580717671 +0000 UTC m=+0.112446267 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:08:41 compute-1 sudo[175036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcpzuconsfrohdravatpgbgsbhphkzjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989721.347493-1534-107780442410142/AnsiballZ_file.py'
Nov 24 13:08:41 compute-1 sudo[175036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:41 compute-1 sshd-session[174883]: Invalid user user2 from 176.114.89.34 port 36148
Nov 24 13:08:41 compute-1 python3.9[175038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:41 compute-1 sudo[175036]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:41 compute-1 sshd-session[174883]: Received disconnect from 176.114.89.34 port 36148:11: Bye Bye [preauth]
Nov 24 13:08:41 compute-1 sshd-session[174883]: Disconnected from invalid user user2 176.114.89.34 port 36148 [preauth]
Nov 24 13:08:42 compute-1 sudo[175199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrcyqhmnpvedqsxiwxgbcyntczehoeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989722.0624712-1534-264151955471502/AnsiballZ_file.py'
Nov 24 13:08:42 compute-1 sudo[175199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:42 compute-1 podman[175162]: 2025-11-24 13:08:42.422809467 +0000 UTC m=+0.067476929 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 24 13:08:42 compute-1 python3.9[175206]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:42 compute-1 sudo[175199]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:43 compute-1 sudo[175360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avkshikejywhgcxdgcfdobmikusmxigc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989722.7381327-1534-139394559188328/AnsiballZ_file.py'
Nov 24 13:08:43 compute-1 sudo[175360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:43 compute-1 python3.9[175362]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:43 compute-1 sudo[175360]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:43 compute-1 sudo[175512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryppdhgjroxkflhdxssvcollustsjcdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989723.4599438-1534-200506000441115/AnsiballZ_file.py'
Nov 24 13:08:43 compute-1 sudo[175512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:44 compute-1 python3.9[175514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:44 compute-1 sudo[175512]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:44 compute-1 sudo[175664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqcdcvvvidqebcljywpwxnauzgofywty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989724.1643834-1534-96427163542029/AnsiballZ_file.py'
Nov 24 13:08:44 compute-1 sudo[175664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:44 compute-1 sshd-session[173635]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:08:44 compute-1 sshd-session[173635]: banner exchange: Connection from 218.56.160.82 port 34750: Connection timed out
Nov 24 13:08:44 compute-1 python3.9[175666]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:44 compute-1 sudo[175664]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:45 compute-1 sudo[175816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mepqvhlhknsgjqluwmcwnscbvxadvpyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989724.816045-1534-122084380314342/AnsiballZ_file.py'
Nov 24 13:08:45 compute-1 sudo[175816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:45 compute-1 python3.9[175818]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:45 compute-1 sudo[175816]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:45 compute-1 sudo[175968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijqybhfgsivfkkgatbeqltxcjridirzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989725.4398713-1534-240131480624675/AnsiballZ_file.py'
Nov 24 13:08:45 compute-1 sudo[175968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:45 compute-1 python3.9[175970]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:45 compute-1 sudo[175968]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:46 compute-1 sudo[176120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyjybaqsuzfzwgcxscygnwueeguutukz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989726.053994-1534-25011279802783/AnsiballZ_file.py'
Nov 24 13:08:46 compute-1 sudo[176120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:46 compute-1 python3.9[176122]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:46 compute-1 sudo[176120]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:47 compute-1 sudo[176272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeghzludbpankiapydvwtcluuaaagfln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989727.0860548-1648-224258711877743/AnsiballZ_file.py'
Nov 24 13:08:47 compute-1 sudo[176272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:47 compute-1 python3.9[176274]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:47 compute-1 sudo[176272]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:48 compute-1 sudo[176424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbbqfqldumtvxxkkwgevxyckebpbgzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989727.963805-1648-122538419338045/AnsiballZ_file.py'
Nov 24 13:08:48 compute-1 sudo[176424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:48 compute-1 python3.9[176426]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:48 compute-1 sudo[176424]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:48 compute-1 sudo[176576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lckbuouxsdhtpzxtewxdhhnlgbstpvxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989728.5605028-1648-32589215023676/AnsiballZ_file.py'
Nov 24 13:08:48 compute-1 sudo[176576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:49 compute-1 python3.9[176578]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:49 compute-1 sudo[176576]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:49 compute-1 sudo[176728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbhvfmtavjtzrinzufsklqqjpujhxoct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989729.15225-1648-230178597321902/AnsiballZ_file.py'
Nov 24 13:08:49 compute-1 sudo[176728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:49 compute-1 python3.9[176730]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:49 compute-1 sudo[176728]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:50 compute-1 sudo[176881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpsfzlbnxjijuhyzgemltlfhcsmyejvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989729.7839818-1648-100695032671187/AnsiballZ_file.py'
Nov 24 13:08:50 compute-1 sudo[176881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:50 compute-1 python3.9[176883]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:50 compute-1 sudo[176881]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:50 compute-1 sudo[177033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkrkokmhzfbygpngmdyutansittkusot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989730.44438-1648-187489622287280/AnsiballZ_file.py'
Nov 24 13:08:50 compute-1 sudo[177033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:50 compute-1 python3.9[177035]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:50 compute-1 sudo[177033]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:51 compute-1 sudo[177185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzuxatqmrxyoqqnonzkvqnvyhgcpgzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989731.0760436-1648-246427098254123/AnsiballZ_file.py'
Nov 24 13:08:51 compute-1 sudo[177185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:51 compute-1 python3.9[177187]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:51 compute-1 sudo[177185]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:52 compute-1 sudo[177337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvwyrxwvkdwgjqfemfnytonbexrvyky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989731.779861-1648-173295671443115/AnsiballZ_file.py'
Nov 24 13:08:52 compute-1 sudo[177337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:52 compute-1 python3.9[177339]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:08:52 compute-1 sudo[177337]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:54 compute-1 sudo[177489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfhvhnxnntcwwtsqzfqoghpbqdqvrubf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989733.8922498-1765-196590981133974/AnsiballZ_command.py'
Nov 24 13:08:54 compute-1 sudo[177489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:54 compute-1 python3.9[177491]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:54 compute-1 sudo[177489]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:55 compute-1 python3.9[177643]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 13:08:55 compute-1 sudo[177793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zurkqhlnhvsxcpofqteumziyehmcgryy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989735.4940076-1800-37542480598684/AnsiballZ_systemd_service.py'
Nov 24 13:08:55 compute-1 sudo[177793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:56 compute-1 python3.9[177795]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:08:56 compute-1 systemd[1]: Reloading.
Nov 24 13:08:56 compute-1 systemd-rc-local-generator[177820]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:08:56 compute-1 systemd-sysv-generator[177823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:08:56 compute-1 sudo[177793]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:56 compute-1 sudo[177979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ougeabuakjsjnfumeovnskfurtvuxhiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989736.7117248-1816-114466046878109/AnsiballZ_command.py'
Nov 24 13:08:56 compute-1 sudo[177979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:57 compute-1 python3.9[177981]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:57 compute-1 sudo[177979]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:57 compute-1 sudo[178132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etcdisewxgceuextflhafqbweqygvuec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989737.2752519-1816-6602650794994/AnsiballZ_command.py'
Nov 24 13:08:57 compute-1 sudo[178132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:57 compute-1 python3.9[178134]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:57 compute-1 sudo[178132]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:58 compute-1 sudo[178285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrtjzlnvgrphiynwhvnhbpyyhhmyatj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989737.8080869-1816-129040263821624/AnsiballZ_command.py'
Nov 24 13:08:58 compute-1 sudo[178285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:58 compute-1 python3.9[178287]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:58 compute-1 sudo[178285]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:58 compute-1 sudo[178439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umyundkryjrystomlqpstbwunhhimpnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989738.4175296-1816-40873477490849/AnsiballZ_command.py'
Nov 24 13:08:58 compute-1 sudo[178439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:58 compute-1 python3.9[178441]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:58 compute-1 sudo[178439]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:59 compute-1 sudo[178592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfrmvmloutfbnjzzutnditblknqcumly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989739.0233772-1816-116844912527799/AnsiballZ_command.py'
Nov 24 13:08:59 compute-1 sudo[178592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:08:59 compute-1 python3.9[178594]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:08:59 compute-1 sudo[178592]: pam_unix(sudo:session): session closed for user root
Nov 24 13:08:59 compute-1 sudo[178745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhfbfflhmeclgngfxmxfdtpijaccoiur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989739.5954993-1816-56868426034170/AnsiballZ_command.py'
Nov 24 13:08:59 compute-1 sudo[178745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:00 compute-1 python3.9[178747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:09:00 compute-1 sudo[178745]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:00 compute-1 sshd-session[176854]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:09:00 compute-1 sshd-session[176854]: banner exchange: Connection from 218.56.160.82 port 35818: Connection timed out
Nov 24 13:09:00 compute-1 sudo[178898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbphncaeaupauejpejtkaaqwrjymnktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989740.269668-1816-34746388009635/AnsiballZ_command.py'
Nov 24 13:09:00 compute-1 sudo[178898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:00 compute-1 python3.9[178900]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:09:00 compute-1 sudo[178898]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:00 compute-1 sshd-session[178901]: Invalid user sol from 45.148.10.240 port 37990
Nov 24 13:09:01 compute-1 sshd-session[178901]: Connection closed by invalid user sol 45.148.10.240 port 37990 [preauth]
Nov 24 13:09:01 compute-1 sudo[179054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrveunxrjzjcybrgkwoifjdmnyyrfgdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989740.8362405-1816-148911698472010/AnsiballZ_command.py'
Nov 24 13:09:01 compute-1 sudo[179054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:01 compute-1 python3.9[179056]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:09:01 compute-1 sudo[179054]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:03 compute-1 sudo[179207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohdagmmgrpzqykozpamduswsmbcijuar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989743.6399643-1959-134244843455632/AnsiballZ_file.py'
Nov 24 13:09:03 compute-1 sudo[179207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:04 compute-1 python3.9[179209]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:04 compute-1 sudo[179207]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:09:04.134 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:09:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:09:04.136 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:09:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:09:04.136 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:09:04 compute-1 sudo[179359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qalxgyjgrzqaxbuiqpwnqttqslbetkva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989744.2089598-1959-196879424908078/AnsiballZ_file.py'
Nov 24 13:09:04 compute-1 sudo[179359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:04 compute-1 python3.9[179361]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:04 compute-1 sudo[179359]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:05 compute-1 sudo[179511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wntmsonspbbkmkzhrnlxxpmdofzqddvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989744.776272-1959-273758121197652/AnsiballZ_file.py'
Nov 24 13:09:05 compute-1 sudo[179511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:05 compute-1 python3.9[179513]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:05 compute-1 sudo[179511]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:06 compute-1 sudo[179663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmmdjlsvozpsexxggmohvdipdmaocbue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989745.8206155-2003-200929742884556/AnsiballZ_file.py'
Nov 24 13:09:06 compute-1 sudo[179663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:06 compute-1 python3.9[179665]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:06 compute-1 sudo[179663]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:06 compute-1 sudo[179815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldzekgjgnwocazfriwapgijnotgbaujb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989746.5046668-2003-250503363983673/AnsiballZ_file.py'
Nov 24 13:09:06 compute-1 sudo[179815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:07 compute-1 python3.9[179817]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:07 compute-1 sudo[179815]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:07 compute-1 sudo[179967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhqpngtpjywvuyjufvfugflifgfbrfpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989747.1935837-2003-147004651244502/AnsiballZ_file.py'
Nov 24 13:09:07 compute-1 sudo[179967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:07 compute-1 python3.9[179969]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:07 compute-1 sudo[179967]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:08 compute-1 sudo[180119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkahxmtvkbcrwlfdsuqznszhxdomssat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989747.8563359-2003-194363631814295/AnsiballZ_file.py'
Nov 24 13:09:08 compute-1 sudo[180119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:08 compute-1 python3.9[180121]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:08 compute-1 sudo[180119]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:08 compute-1 sudo[180271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsejquhqmeieignrnuwwvgtaadnpjiaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989748.5499563-2003-155264880110757/AnsiballZ_file.py'
Nov 24 13:09:08 compute-1 sudo[180271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:09 compute-1 python3.9[180273]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:09 compute-1 sshd-session[180274]: Received disconnect from 85.209.134.43 port 58532:11: Bye Bye [preauth]
Nov 24 13:09:09 compute-1 sshd-session[180274]: Disconnected from authenticating user root 85.209.134.43 port 58532 [preauth]
Nov 24 13:09:09 compute-1 sudo[180271]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:09 compute-1 podman[180276]: 2025-11-24 13:09:09.169305161 +0000 UTC m=+0.053322803 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:09:09 compute-1 sudo[180447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnftjqnkulycxmbyycyiycsalcfpxqqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989749.2596843-2003-100712529134078/AnsiballZ_file.py'
Nov 24 13:09:09 compute-1 sudo[180447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:09 compute-1 python3.9[180449]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:09 compute-1 sudo[180447]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:10 compute-1 sudo[180599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulbgpksaksvmoyfeaqoekmmwlavcxlyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989749.974293-2003-95772503825543/AnsiballZ_file.py'
Nov 24 13:09:10 compute-1 sudo[180599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:10 compute-1 python3.9[180601]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:10 compute-1 sudo[180599]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:12 compute-1 podman[180627]: 2025-11-24 13:09:12.505252276 +0000 UTC m=+0.053510278 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:09:12 compute-1 podman[180626]: 2025-11-24 13:09:12.527019245 +0000 UTC m=+0.080142762 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:09:15 compute-1 sudo[180795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnqrltxideqynmoayzyjuozcfyjuzuyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989755.5313115-2240-216873025241002/AnsiballZ_getent.py'
Nov 24 13:09:15 compute-1 sudo[180795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:16 compute-1 python3.9[180797]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 24 13:09:16 compute-1 sudo[180795]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:17 compute-1 sudo[180948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zihjnaciruljywlscneexlztyyesbgvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989756.5633538-2256-98491546641785/AnsiballZ_group.py'
Nov 24 13:09:17 compute-1 sudo[180948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:17 compute-1 python3.9[180950]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 13:09:17 compute-1 groupadd[180951]: group added to /etc/group: name=nova, GID=42436
Nov 24 13:09:17 compute-1 groupadd[180951]: group added to /etc/gshadow: name=nova
Nov 24 13:09:17 compute-1 groupadd[180951]: new group: name=nova, GID=42436
Nov 24 13:09:17 compute-1 sudo[180948]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:18 compute-1 sudo[181106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjedlrbofddinheboekcgqthemaopdvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989757.605318-2272-135861154198355/AnsiballZ_user.py'
Nov 24 13:09:18 compute-1 sudo[181106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:18 compute-1 python3.9[181108]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 13:09:18 compute-1 useradd[181110]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 24 13:09:18 compute-1 useradd[181110]: add 'nova' to group 'libvirt'
Nov 24 13:09:18 compute-1 useradd[181110]: add 'nova' to shadow group 'libvirt'
Nov 24 13:09:18 compute-1 sudo[181106]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:19 compute-1 sshd-session[181141]: Accepted publickey for zuul from 192.168.122.30 port 36988 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:09:19 compute-1 systemd-logind[815]: New session 26 of user zuul.
Nov 24 13:09:19 compute-1 systemd[1]: Started Session 26 of User zuul.
Nov 24 13:09:19 compute-1 sshd-session[181141]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:09:19 compute-1 sshd-session[181144]: Received disconnect from 192.168.122.30 port 36988:11: disconnected by user
Nov 24 13:09:19 compute-1 sshd-session[181144]: Disconnected from user zuul 192.168.122.30 port 36988
Nov 24 13:09:19 compute-1 sshd-session[181141]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:09:19 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Nov 24 13:09:19 compute-1 systemd-logind[815]: Session 26 logged out. Waiting for processes to exit.
Nov 24 13:09:19 compute-1 systemd-logind[815]: Removed session 26.
Nov 24 13:09:19 compute-1 sshd-session[180414]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:09:19 compute-1 sshd-session[180414]: banner exchange: Connection from 218.56.160.82 port 22329: Connection timed out
Nov 24 13:09:20 compute-1 python3.9[181294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:21 compute-1 python3.9[181415]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989760.0771942-2322-152161280002013/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:21 compute-1 python3.9[181565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:22 compute-1 python3.9[181641]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:23 compute-1 python3.9[181791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:23 compute-1 python3.9[181912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989762.651961-2322-22363363833716/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:24 compute-1 python3.9[182062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:24 compute-1 python3.9[182185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989763.8637526-2322-234598731532397/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:25 compute-1 sshd-session[182133]: Invalid user user from 5.198.176.28 port 42654
Nov 24 13:09:25 compute-1 sshd-session[182133]: Received disconnect from 5.198.176.28 port 42654:11: Bye Bye [preauth]
Nov 24 13:09:25 compute-1 sshd-session[182133]: Disconnected from invalid user user 5.198.176.28 port 42654 [preauth]
Nov 24 13:09:25 compute-1 python3.9[182336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:26 compute-1 python3.9[182457]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989765.132723-2322-192124523680824/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:26 compute-1 python3.9[182607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:27 compute-1 python3.9[182728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989766.4755807-2322-275597865877768/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:28 compute-1 sudo[182878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lklzoepopkgowhekcnecmuchitezeyds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989768.5593593-2489-189149023319497/AnsiballZ_file.py'
Nov 24 13:09:28 compute-1 sudo[182878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:29 compute-1 python3.9[182880]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:09:29 compute-1 sudo[182878]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:29 compute-1 sudo[183030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgewivpxupvzerfxazcxcwgcbhdrqic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989769.3554397-2504-4336690397780/AnsiballZ_copy.py'
Nov 24 13:09:29 compute-1 sudo[183030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:29 compute-1 python3.9[183032]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:09:29 compute-1 sudo[183030]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:30 compute-1 sudo[183182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eborshubksbqbggnaqxedhljhaeaqmkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989770.1367507-2520-147795476580484/AnsiballZ_stat.py'
Nov 24 13:09:30 compute-1 sudo[183182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:30 compute-1 python3.9[183184]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:30 compute-1 sudo[183182]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:31 compute-1 sudo[183334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avjhepntqccovkjxpppnalvatzgjglvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989770.9088373-2537-240754504193883/AnsiballZ_stat.py'
Nov 24 13:09:31 compute-1 sudo[183334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:31 compute-1 python3.9[183336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:31 compute-1 sudo[183334]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:31 compute-1 sudo[183457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fesbebaknkacazcoqwidjhgnlfhskfnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989770.9088373-2537-240754504193883/AnsiballZ_copy.py'
Nov 24 13:09:31 compute-1 sudo[183457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:32 compute-1 python3.9[183459]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763989770.9088373-2537-240754504193883/.source _original_basename=.0opgcmrf follow=False checksum=9e28d32b17712b9cfd9e9f808ebb58ce6cc98b9e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 24 13:09:32 compute-1 sudo[183457]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:33 compute-1 python3.9[183611]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:33 compute-1 python3.9[183763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:34 compute-1 python3.9[183884]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989773.3981364-2589-14225679529000/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:35 compute-1 python3.9[184034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:09:35 compute-1 sshd-session[182215]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:09:35 compute-1 sshd-session[182215]: banner exchange: Connection from 218.56.160.82 port 21238: Connection timed out
Nov 24 13:09:35 compute-1 python3.9[184155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989774.7480702-2618-245330621855263/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:09:36 compute-1 sudo[184305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppwojrwqjqxggwpruavpvfvfepmyhtvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989776.176108-2652-45010729033117/AnsiballZ_container_config_data.py'
Nov 24 13:09:36 compute-1 sudo[184305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:36 compute-1 python3.9[184307]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 24 13:09:36 compute-1 sudo[184305]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:37 compute-1 sudo[184457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjmnwtrkaccmprkfwdjmcyrugrreoudq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989777.0436065-2670-28204024411039/AnsiballZ_container_config_hash.py'
Nov 24 13:09:37 compute-1 sudo[184457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:37 compute-1 python3.9[184459]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:09:37 compute-1 sudo[184457]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:38 compute-1 sudo[184609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fehytpsoemylxejthiypobiaybhfdkjx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989777.9573061-2690-190349408466746/AnsiballZ_edpm_container_manage.py'
Nov 24 13:09:38 compute-1 sudo[184609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:38 compute-1 python3[184611]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:09:38 compute-1 podman[184650]: 2025-11-24 13:09:38.820167516 +0000 UTC m=+0.066723331 container create 90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 24 13:09:38 compute-1 podman[184650]: 2025-11-24 13:09:38.79455341 +0000 UTC m=+0.041109225 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 24 13:09:38 compute-1 python3[184611]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 24 13:09:38 compute-1 sudo[184609]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:39 compute-1 podman[184753]: 2025-11-24 13:09:39.537330799 +0000 UTC m=+0.075300786 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 24 13:09:39 compute-1 sudo[184857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbkggocvbqaxlpcahimzqotfwtbufmkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989779.38893-2706-152184327028319/AnsiballZ_stat.py'
Nov 24 13:09:39 compute-1 sudo[184857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:39 compute-1 python3.9[184859]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:39 compute-1 sudo[184857]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:40 compute-1 sudo[185011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stufrajilkwzazsuwwngrcyoqbjpjuhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989780.5636368-2730-146696900761397/AnsiballZ_container_config_data.py'
Nov 24 13:09:40 compute-1 sudo[185011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:41 compute-1 python3.9[185013]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 24 13:09:41 compute-1 sudo[185011]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:41 compute-1 sudo[185163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sregtenbsurychnojhswdzcvfsctdtjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989781.4758925-2748-144903338461762/AnsiballZ_container_config_hash.py'
Nov 24 13:09:41 compute-1 sudo[185163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:42 compute-1 python3.9[185165]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:09:42 compute-1 sudo[185163]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:43 compute-1 sudo[185335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiflzdcxopvfxotxowwtzuhtgqivtuui ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989782.6503286-2768-216229241019142/AnsiballZ_edpm_container_manage.py'
Nov 24 13:09:43 compute-1 sudo[185335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:43 compute-1 podman[185289]: 2025-11-24 13:09:43.075344593 +0000 UTC m=+0.114911048 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:09:43 compute-1 podman[185290]: 2025-11-24 13:09:43.093012429 +0000 UTC m=+0.125745436 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:09:43 compute-1 python3[185347]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:09:43 compute-1 podman[185398]: 2025-11-24 13:09:43.524291635 +0000 UTC m=+0.064090418 container create da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:09:43 compute-1 podman[185398]: 2025-11-24 13:09:43.498760011 +0000 UTC m=+0.038558774 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 24 13:09:43 compute-1 python3[185347]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 24 13:09:43 compute-1 sudo[185335]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:44 compute-1 sudo[185587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgkgozjrxeqioxkmzkrojnkxbtixgeif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989783.9869716-2785-31504705328856/AnsiballZ_stat.py'
Nov 24 13:09:44 compute-1 sudo[185587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:44 compute-1 python3.9[185589]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:44 compute-1 sudo[185587]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:45 compute-1 sudo[185744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flvkhahusjmhoiayayqhnwsywrexmrps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989784.8817444-2802-136131886611831/AnsiballZ_file.py'
Nov 24 13:09:45 compute-1 sudo[185744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:45 compute-1 sshd-session[185592]: Invalid user support from 176.114.89.34 port 32896
Nov 24 13:09:45 compute-1 sshd-session[185592]: Received disconnect from 176.114.89.34 port 32896:11: Bye Bye [preauth]
Nov 24 13:09:45 compute-1 sshd-session[185592]: Disconnected from invalid user support 176.114.89.34 port 32896 [preauth]
Nov 24 13:09:45 compute-1 python3.9[185746]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:09:45 compute-1 sudo[185744]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:45 compute-1 sudo[185895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbaunxfwdhuzswbwpljnoueqsifodlwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989785.501441-2802-30343512939983/AnsiballZ_copy.py'
Nov 24 13:09:45 compute-1 sudo[185895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:46 compute-1 sshd-session[185461]: Invalid user tigergraph from 218.56.160.82 port 20079
Nov 24 13:09:46 compute-1 python3.9[185897]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763989785.501441-2802-30343512939983/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:09:46 compute-1 sudo[185895]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:46 compute-1 sshd-session[185461]: Received disconnect from 218.56.160.82 port 20079:11: Bye Bye [preauth]
Nov 24 13:09:46 compute-1 sshd-session[185461]: Disconnected from invalid user tigergraph 218.56.160.82 port 20079 [preauth]
Nov 24 13:09:46 compute-1 sudo[185971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inljexddfhnmtacclnsaqmnxxkesxmis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989785.501441-2802-30343512939983/AnsiballZ_systemd.py'
Nov 24 13:09:46 compute-1 sudo[185971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:46 compute-1 python3.9[185973]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:09:46 compute-1 systemd[1]: Reloading.
Nov 24 13:09:46 compute-1 systemd-rc-local-generator[185995]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:09:46 compute-1 systemd-sysv-generator[186002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:09:47 compute-1 sudo[185971]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:47 compute-1 sudo[186081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuvualzsulahhkjpdjwozxttjznvdlje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989785.501441-2802-30343512939983/AnsiballZ_systemd.py'
Nov 24 13:09:47 compute-1 sudo[186081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:48 compute-1 python3.9[186083]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:09:48 compute-1 systemd[1]: Reloading.
Nov 24 13:09:48 compute-1 systemd-rc-local-generator[186113]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:09:48 compute-1 systemd-sysv-generator[186116]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:09:48 compute-1 systemd[1]: Starting nova_compute container...
Nov 24 13:09:48 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:09:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:48 compute-1 podman[186123]: 2025-11-24 13:09:48.597927087 +0000 UTC m=+0.117190530 container init da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 24 13:09:48 compute-1 podman[186123]: 2025-11-24 13:09:48.60562495 +0000 UTC m=+0.124888363 container start da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 13:09:48 compute-1 nova_compute[186138]: + sudo -E kolla_set_configs
Nov 24 13:09:48 compute-1 podman[186123]: nova_compute
Nov 24 13:09:48 compute-1 systemd[1]: Started nova_compute container.
Nov 24 13:09:48 compute-1 sudo[186081]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Validating config file
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying service configuration files
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Deleting /etc/ceph
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Creating directory /etc/ceph
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /etc/ceph
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Writing out command to execute
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:48 compute-1 nova_compute[186138]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 13:09:48 compute-1 nova_compute[186138]: ++ cat /run_command
Nov 24 13:09:48 compute-1 nova_compute[186138]: + CMD=nova-compute
Nov 24 13:09:48 compute-1 nova_compute[186138]: + ARGS=
Nov 24 13:09:48 compute-1 nova_compute[186138]: + sudo kolla_copy_cacerts
Nov 24 13:09:48 compute-1 nova_compute[186138]: + [[ ! -n '' ]]
Nov 24 13:09:48 compute-1 nova_compute[186138]: + . kolla_extend_start
Nov 24 13:09:48 compute-1 nova_compute[186138]: Running command: 'nova-compute'
Nov 24 13:09:48 compute-1 nova_compute[186138]: + echo 'Running command: '\''nova-compute'\'''
Nov 24 13:09:48 compute-1 nova_compute[186138]: + umask 0022
Nov 24 13:09:48 compute-1 nova_compute[186138]: + exec nova-compute
Nov 24 13:09:50 compute-1 python3.9[186302]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:50 compute-1 sshd-session[186174]: Invalid user carlos from 68.183.82.237 port 59280
Nov 24 13:09:50 compute-1 sshd-session[186174]: Received disconnect from 68.183.82.237 port 59280:11: Bye Bye [preauth]
Nov 24 13:09:50 compute-1 sshd-session[186174]: Disconnected from invalid user carlos 68.183.82.237 port 59280 [preauth]
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.667 186142 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.667 186142 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.668 186142 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.668 186142 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.844 186142 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.868 186142 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:09:50 compute-1 nova_compute[186138]: 2025-11-24 13:09:50.869 186142 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 24 13:09:50 compute-1 python3.9[186454]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.481 186142 INFO nova.virt.driver [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.576 186142 INFO nova.compute.provider_config [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.588 186142 DEBUG oslo_concurrency.lockutils [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.588 186142 DEBUG oslo_concurrency.lockutils [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.589 186142 DEBUG oslo_concurrency.lockutils [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.589 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.589 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.589 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.590 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.591 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.592 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.593 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.594 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.595 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.595 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.595 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.595 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.596 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.597 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.598 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.599 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.600 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.601 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.602 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.603 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.604 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.605 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.606 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.607 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.608 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.609 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.610 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.611 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.611 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.611 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.611 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.611 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.612 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.613 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.614 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.615 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.616 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.617 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.618 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.619 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.620 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.621 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.622 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.623 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.624 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.625 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.626 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.627 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.628 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.629 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.630 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.631 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.632 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.633 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.634 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.635 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.636 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.636 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.636 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.636 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.636 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.636 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.637 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.638 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.638 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.638 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.638 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.638 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.639 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.640 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.640 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.640 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.640 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.640 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.640 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.641 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.641 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.641 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.641 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.641 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.641 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.642 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.643 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.644 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.644 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.644 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.644 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.644 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.644 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.645 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.645 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.645 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.645 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.645 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.645 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.646 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.646 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.646 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.646 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.646 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.646 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.647 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.647 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.647 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.647 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.647 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.647 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.648 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.649 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.650 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.650 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.650 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.650 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.650 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.650 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.651 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.651 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.651 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.651 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.651 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.651 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.652 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.653 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.653 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.653 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.653 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.653 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.653 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.654 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.655 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.655 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.655 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.655 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.655 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.655 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.656 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.657 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.657 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.657 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.657 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.657 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.657 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.658 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.659 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.659 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.659 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.659 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.659 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.660 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.660 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.660 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.660 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.660 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.660 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.661 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.661 186142 WARNING oslo_config.cfg [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 24 13:09:51 compute-1 nova_compute[186138]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 24 13:09:51 compute-1 nova_compute[186138]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 24 13:09:51 compute-1 nova_compute[186138]: and ``live_migration_inbound_addr`` respectively.
Nov 24 13:09:51 compute-1 nova_compute[186138]: ).  Its value may be silently ignored in the future.
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.661 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.661 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.661 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.662 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.663 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.663 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.663 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.663 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.663 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.663 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.664 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.665 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.666 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.667 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.668 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.669 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.670 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.671 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.672 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.673 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.673 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.673 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.673 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.673 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.673 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.674 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.675 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.676 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.677 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.678 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.679 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.680 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.681 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.681 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.681 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.681 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.681 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.681 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.682 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.683 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.683 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.683 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.683 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.683 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.683 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.684 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.685 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.686 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.687 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.687 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.687 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.687 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.687 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.687 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.688 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.689 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.689 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.689 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.689 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.689 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.689 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.690 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.691 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.692 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.693 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.694 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.695 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.696 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.697 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.698 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.699 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.699 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.699 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.699 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.699 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.699 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.700 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.701 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.702 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.703 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.704 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.705 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.706 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.707 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.708 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.709 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.710 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.711 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.712 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.713 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.714 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.715 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.716 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.717 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.718 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.719 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.720 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.721 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.721 186142 DEBUG oslo_service.service [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.721 186142 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.733 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.734 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.735 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.735 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 24 13:09:51 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Nov 24 13:09:51 compute-1 systemd[1]: Started libvirt QEMU daemon.
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.818 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f52def20be0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.822 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f52def20be0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.823 186142 INFO nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Connection event '1' reason 'None'
Nov 24 13:09:51 compute-1 python3.9[186606]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.840 186142 WARNING nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 24 13:09:51 compute-1 nova_compute[186138]: 2025-11-24 13:09:51.840 186142 DEBUG nova.virt.libvirt.volume.mount [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.701 186142 INFO nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Libvirt host capabilities <capabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]: 
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <host>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <uuid>44bc1d66-b47f-487f-828a-c6d054feec1c</uuid>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <arch>x86_64</arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model>EPYC-Rome-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <vendor>AMD</vendor>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <microcode version='16777317'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <signature family='23' model='49' stepping='0'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='x2apic'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='tsc-deadline'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='osxsave'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='hypervisor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='tsc_adjust'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='spec-ctrl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='stibp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='arch-capabilities'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='cmp_legacy'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='topoext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='virt-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='lbrv'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='tsc-scale'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='vmcb-clean'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='pause-filter'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='pfthreshold'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='svme-addr-chk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='rdctl-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='skip-l1dfl-vmentry'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='mds-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature name='pschange-mc-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <pages unit='KiB' size='4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <pages unit='KiB' size='2048'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <pages unit='KiB' size='1048576'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <power_management>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <suspend_mem/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <suspend_disk/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <suspend_hybrid/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </power_management>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <iommu support='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <migration_features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <live/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <uri_transports>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <uri_transport>tcp</uri_transport>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <uri_transport>rdma</uri_transport>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </uri_transports>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </migration_features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <topology>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <cells num='1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <cell id='0'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           <memory unit='KiB'>7864316</memory>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           <pages unit='KiB' size='2048'>0</pages>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           <distances>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <sibling id='0' value='10'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           </distances>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           <cpus num='8'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:           </cpus>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         </cell>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </cells>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </topology>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <cache>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </cache>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <secmodel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model>selinux</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <doi>0</doi>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </secmodel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <secmodel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model>dac</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <doi>0</doi>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </secmodel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </host>
Nov 24 13:09:52 compute-1 nova_compute[186138]: 
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <guest>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <os_type>hvm</os_type>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <arch name='i686'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <wordsize>32</wordsize>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <domain type='qemu'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <domain type='kvm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <pae/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <nonpae/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <acpi default='on' toggle='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <apic default='on' toggle='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <cpuselection/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <deviceboot/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <disksnapshot default='on' toggle='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <externalSnapshot/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </guest>
Nov 24 13:09:52 compute-1 nova_compute[186138]: 
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <guest>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <os_type>hvm</os_type>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <arch name='x86_64'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <wordsize>64</wordsize>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <domain type='qemu'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <domain type='kvm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <acpi default='on' toggle='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <apic default='on' toggle='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <cpuselection/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <deviceboot/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <disksnapshot default='on' toggle='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <externalSnapshot/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </guest>
Nov 24 13:09:52 compute-1 nova_compute[186138]: 
Nov 24 13:09:52 compute-1 nova_compute[186138]: </capabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]: 
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.712 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.740 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 24 13:09:52 compute-1 nova_compute[186138]: <domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <domain>kvm</domain>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <arch>i686</arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <vcpu max='240'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <iothreads supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <os supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='firmware'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <loader supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>rom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pflash</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='readonly'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>yes</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='secure'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </loader>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </os>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='maximumMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <vendor>AMD</vendor>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='succor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='custom' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-128'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-256'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-512'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <memoryBacking supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='sourceType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>anonymous</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>memfd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </memoryBacking>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <disk supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='diskDevice'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>disk</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cdrom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>floppy</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>lun</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ide</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>fdc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>sata</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </disk>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <graphics supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vnc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egl-headless</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </graphics>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <video supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='modelType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vga</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cirrus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>none</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>bochs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ramfb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </video>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hostdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='mode'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>subsystem</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='startupPolicy'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>mandatory</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>requisite</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>optional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='subsysType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pci</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='capsType'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='pciBackend'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hostdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <rng supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>random</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </rng>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <filesystem supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='driverType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>path</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>handle</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtiofs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </filesystem>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <tpm supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-tis</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-crb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emulator</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>external</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendVersion'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>2.0</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </tpm>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <redirdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </redirdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <channel supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </channel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <crypto supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </crypto>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <interface supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>passt</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </interface>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <panic supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>isa</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>hyperv</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </panic>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <console supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>null</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dev</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pipe</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stdio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>udp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tcp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu-vdagent</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </console>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <gic supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <genid supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backup supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <async-teardown supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <ps2 supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sev supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sgx supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hyperv supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='features'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>relaxed</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vapic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>spinlocks</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vpindex</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>runtime</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>synic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stimer</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reset</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vendor_id</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>frequencies</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reenlightenment</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tlbflush</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ipi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>avic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emsr_bitmap</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>xmm_input</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hyperv>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <launchSecurity supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='sectype'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tdx</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </launchSecurity>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </features>
Nov 24 13:09:52 compute-1 nova_compute[186138]: </domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.747 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 24 13:09:52 compute-1 nova_compute[186138]: <domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <domain>kvm</domain>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <arch>i686</arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <vcpu max='4096'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <iothreads supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <os supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='firmware'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <loader supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>rom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pflash</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='readonly'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>yes</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='secure'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </loader>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </os>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='maximumMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <vendor>AMD</vendor>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='succor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='custom' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-128'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-256'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-512'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <memoryBacking supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='sourceType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>anonymous</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>memfd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </memoryBacking>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <disk supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='diskDevice'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>disk</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cdrom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>floppy</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>lun</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>fdc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>sata</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </disk>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <graphics supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vnc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egl-headless</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </graphics>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <video supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='modelType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vga</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cirrus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>none</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>bochs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ramfb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </video>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hostdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='mode'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>subsystem</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='startupPolicy'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>mandatory</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>requisite</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>optional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='subsysType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pci</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='capsType'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='pciBackend'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hostdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <rng supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>random</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </rng>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <filesystem supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='driverType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>path</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>handle</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtiofs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </filesystem>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <tpm supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-tis</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-crb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emulator</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>external</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendVersion'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>2.0</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </tpm>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <redirdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </redirdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <channel supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </channel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <crypto supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </crypto>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <interface supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>passt</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </interface>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <panic supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>isa</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>hyperv</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </panic>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <console supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>null</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dev</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pipe</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stdio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>udp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tcp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu-vdagent</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </console>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <gic supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <genid supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backup supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <async-teardown supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <ps2 supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sev supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sgx supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hyperv supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='features'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>relaxed</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vapic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>spinlocks</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vpindex</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>runtime</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>synic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stimer</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reset</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vendor_id</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>frequencies</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reenlightenment</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tlbflush</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ipi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>avic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emsr_bitmap</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>xmm_input</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hyperv>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <launchSecurity supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='sectype'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tdx</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </launchSecurity>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </features>
Nov 24 13:09:52 compute-1 nova_compute[186138]: </domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.773 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.778 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 24 13:09:52 compute-1 nova_compute[186138]: <domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <domain>kvm</domain>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <arch>x86_64</arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <vcpu max='240'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <iothreads supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <os supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='firmware'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <loader supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>rom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pflash</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='readonly'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>yes</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='secure'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </loader>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </os>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='maximumMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <vendor>AMD</vendor>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='succor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='custom' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-128'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-256'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-512'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <memoryBacking supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='sourceType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>anonymous</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>memfd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </memoryBacking>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <disk supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='diskDevice'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>disk</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cdrom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>floppy</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>lun</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ide</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>fdc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>sata</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </disk>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <graphics supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vnc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egl-headless</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </graphics>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <video supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='modelType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vga</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cirrus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>none</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>bochs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ramfb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </video>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hostdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='mode'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>subsystem</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='startupPolicy'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>mandatory</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>requisite</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>optional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='subsysType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pci</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='capsType'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='pciBackend'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hostdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <rng supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>random</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </rng>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <filesystem supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='driverType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>path</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>handle</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtiofs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </filesystem>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <tpm supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-tis</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-crb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emulator</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>external</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendVersion'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>2.0</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </tpm>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <redirdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </redirdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <channel supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </channel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <crypto supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </crypto>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <interface supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>passt</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </interface>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <panic supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>isa</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>hyperv</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </panic>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <console supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>null</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dev</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pipe</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stdio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>udp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tcp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu-vdagent</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </console>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <gic supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <genid supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backup supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <async-teardown supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <ps2 supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sev supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sgx supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hyperv supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='features'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>relaxed</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vapic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>spinlocks</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vpindex</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>runtime</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>synic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stimer</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reset</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vendor_id</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>frequencies</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reenlightenment</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tlbflush</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ipi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>avic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emsr_bitmap</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>xmm_input</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hyperv>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <launchSecurity supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='sectype'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tdx</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </launchSecurity>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </features>
Nov 24 13:09:52 compute-1 nova_compute[186138]: </domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.838 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 24 13:09:52 compute-1 nova_compute[186138]: <domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <domain>kvm</domain>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <arch>x86_64</arch>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <vcpu max='4096'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <iothreads supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <os supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='firmware'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>efi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <loader supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>rom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pflash</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='readonly'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>yes</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='secure'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>yes</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>no</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </loader>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </os>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='maximumMigratable'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>on</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>off</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <vendor>AMD</vendor>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='succor'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <mode name='custom' supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Denverton-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='auto-ibrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amd-psfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='stibp-always-on'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='EPYC-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-128'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-256'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx10-512'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='prefetchiti'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Haswell-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512er'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512pf'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fma4'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tbm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xop'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='amx-tile'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-bf16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-fp16'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bitalg'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrc'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fzrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='la57'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='taa-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xfd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ifma'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cmpccxadd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fbsdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='fsrs'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ibrs-all'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mcdt-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pbrsb-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='psdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='serialize'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vaes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='hle'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='rtm'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512bw'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512cd'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512dq'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512f'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='avx512vl'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='invpcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pcid'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='pku'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='mpx'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='core-capability'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='split-lock-detect'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='cldemote'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='erms'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='gfni'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdir64b'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='movdiri'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='xsaves'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='athlon-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='core2duo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='coreduo-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='n270-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='ss'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <blockers model='phenom-v1'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnow'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <feature name='3dnowext'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </blockers>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </mode>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <memoryBacking supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <enum name='sourceType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>anonymous</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <value>memfd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </memoryBacking>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <disk supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='diskDevice'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>disk</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cdrom</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>floppy</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>lun</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>fdc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>sata</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </disk>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <graphics supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vnc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egl-headless</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </graphics>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <video supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='modelType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vga</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>cirrus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>none</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>bochs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ramfb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </video>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hostdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='mode'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>subsystem</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='startupPolicy'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>mandatory</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>requisite</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>optional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='subsysType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pci</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>scsi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='capsType'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='pciBackend'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hostdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <rng supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtio-non-transitional</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>random</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>egd</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </rng>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <filesystem supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='driverType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>path</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>handle</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>virtiofs</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </filesystem>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <tpm supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-tis</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tpm-crb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emulator</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>external</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendVersion'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>2.0</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </tpm>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <redirdev supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='bus'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>usb</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </redirdev>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <channel supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </channel>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <crypto supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendModel'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>builtin</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </crypto>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <interface supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='backendType'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>default</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>passt</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </interface>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <panic supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='model'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>isa</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>hyperv</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </panic>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <console supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='type'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>null</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vc</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pty</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dev</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>file</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>pipe</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stdio</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>udp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tcp</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>unix</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>qemu-vdagent</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>dbus</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </console>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </devices>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <features>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <gic supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <genid supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <backup supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <async-teardown supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <ps2 supported='yes'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sev supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <sgx supported='no'/>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <hyperv supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='features'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>relaxed</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vapic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>spinlocks</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vpindex</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>runtime</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>synic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>stimer</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reset</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>vendor_id</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>frequencies</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>reenlightenment</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tlbflush</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>ipi</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>avic</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>emsr_bitmap</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>xmm_input</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </defaults>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </hyperv>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     <launchSecurity supported='yes'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       <enum name='sectype'>
Nov 24 13:09:52 compute-1 nova_compute[186138]:         <value>tdx</value>
Nov 24 13:09:52 compute-1 nova_compute[186138]:       </enum>
Nov 24 13:09:52 compute-1 nova_compute[186138]:     </launchSecurity>
Nov 24 13:09:52 compute-1 nova_compute[186138]:   </features>
Nov 24 13:09:52 compute-1 nova_compute[186138]: </domainCapabilities>
Nov 24 13:09:52 compute-1 nova_compute[186138]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.901 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.901 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.902 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.902 186142 INFO nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Secure Boot support detected
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.905 186142 INFO nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.905 186142 INFO nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.922 186142 DEBUG nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] cpu compare xml: <cpu match="exact">
Nov 24 13:09:52 compute-1 nova_compute[186138]:   <model>Nehalem</model>
Nov 24 13:09:52 compute-1 nova_compute[186138]: </cpu>
Nov 24 13:09:52 compute-1 nova_compute[186138]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.925 186142 DEBUG nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.955 186142 INFO nova.virt.node [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Determined node identity ece8f004-1d5b-407f-a713-f9e87706b045 from /var/lib/nova/compute_id
Nov 24 13:09:52 compute-1 nova_compute[186138]: 2025-11-24 13:09:52.974 186142 WARNING nova.compute.manager [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Compute nodes ['ece8f004-1d5b-407f-a713-f9e87706b045'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.003 186142 INFO nova.compute.manager [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 24 13:09:53 compute-1 sudo[186820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjfubctanodhxvgjpzdzzvtiooksziwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989792.2448614-2922-171915810331924/AnsiballZ_podman_container.py'
Nov 24 13:09:53 compute-1 sudo[186820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.032 186142 WARNING nova.compute.manager [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.032 186142 DEBUG oslo_concurrency.lockutils [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.032 186142 DEBUG oslo_concurrency.lockutils [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.033 186142 DEBUG oslo_concurrency.lockutils [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.033 186142 DEBUG nova.compute.resource_tracker [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:09:53 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Nov 24 13:09:53 compute-1 systemd[1]: Started libvirt nodedev daemon.
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.320 186142 WARNING nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.321 186142 DEBUG nova.compute.resource_tracker [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6186MB free_disk=73.66805267333984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.321 186142 DEBUG oslo_concurrency.lockutils [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.321 186142 DEBUG oslo_concurrency.lockutils [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:09:53 compute-1 python3.9[186822]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.373 186142 WARNING nova.compute.resource_tracker [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] No compute node record for compute-1.ctlplane.example.com:ece8f004-1d5b-407f-a713-f9e87706b045: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ece8f004-1d5b-407f-a713-f9e87706b045 could not be found.
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.426 186142 INFO nova.compute.resource_tracker [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: ece8f004-1d5b-407f-a713-f9e87706b045
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.483 186142 DEBUG nova.compute.resource_tracker [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:09:53 compute-1 nova_compute[186138]: 2025-11-24 13:09:53.484 186142 DEBUG nova.compute.resource_tracker [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:09:53 compute-1 sudo[186820]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:53 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 13:09:54 compute-1 sudo[187015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-writbdycmenaadowtizjhjiyfrqevxsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989793.7886572-2938-189341061604768/AnsiballZ_systemd.py'
Nov 24 13:09:54 compute-1 sudo[187015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:54 compute-1 python3.9[187017]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.434 186142 INFO nova.scheduler.client.report [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] [req-81563321-74ac-450a-a3ca-79421e9b27cd] Created resource provider record via placement API for resource provider with UUID ece8f004-1d5b-407f-a713-f9e87706b045 and name compute-1.ctlplane.example.com.
Nov 24 13:09:54 compute-1 systemd[1]: Stopping nova_compute container...
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.484 186142 DEBUG nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 24 13:09:54 compute-1 nova_compute[186138]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.485 186142 INFO nova.virt.libvirt.host [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] kernel doesn't support AMD SEV
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.486 186142 DEBUG nova.compute.provider_tree [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.486 186142 DEBUG nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.491 186142 DEBUG nova.virt.libvirt.driver [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Libvirt baseline CPU <cpu>
Nov 24 13:09:54 compute-1 nova_compute[186138]:   <arch>x86_64</arch>
Nov 24 13:09:54 compute-1 nova_compute[186138]:   <model>Nehalem</model>
Nov 24 13:09:54 compute-1 nova_compute[186138]:   <vendor>AMD</vendor>
Nov 24 13:09:54 compute-1 nova_compute[186138]:   <topology sockets="8" cores="1" threads="1"/>
Nov 24 13:09:54 compute-1 nova_compute[186138]: </cpu>
Nov 24 13:09:54 compute-1 nova_compute[186138]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.573 186142 DEBUG nova.scheduler.client.report [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Updated inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.574 186142 DEBUG nova.compute.provider_tree [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.574 186142 DEBUG nova.compute.provider_tree [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.655 186142 DEBUG nova.compute.provider_tree [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.660 186142 DEBUG oslo_concurrency.lockutils [None req-25794a70-cd46-4eb9-b1c2-d9bb9dc9e27f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.661 186142 DEBUG oslo_concurrency.lockutils [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.661 186142 DEBUG oslo_concurrency.lockutils [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:09:54 compute-1 nova_compute[186138]: 2025-11-24 13:09:54.661 186142 DEBUG oslo_concurrency.lockutils [None req-2e990322-5988-435e-910f-3ce3f71f871d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:09:55 compute-1 virtqemud[186628]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 24 13:09:55 compute-1 virtqemud[186628]: hostname: compute-1
Nov 24 13:09:55 compute-1 virtqemud[186628]: End of file while reading data: Input/output error
Nov 24 13:09:55 compute-1 systemd[1]: libpod-da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3.scope: Deactivated successfully.
Nov 24 13:09:55 compute-1 systemd[1]: libpod-da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3.scope: Consumed 3.203s CPU time.
Nov 24 13:09:55 compute-1 podman[187021]: 2025-11-24 13:09:55.053487535 +0000 UTC m=+0.568058176 container died da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:09:55 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3-userdata-shm.mount: Deactivated successfully.
Nov 24 13:09:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679-merged.mount: Deactivated successfully.
Nov 24 13:09:55 compute-1 podman[187021]: 2025-11-24 13:09:55.201416912 +0000 UTC m=+0.715987553 container cleanup da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:09:55 compute-1 podman[187021]: nova_compute
Nov 24 13:09:55 compute-1 podman[187050]: nova_compute
Nov 24 13:09:55 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 24 13:09:55 compute-1 systemd[1]: Stopped nova_compute container.
Nov 24 13:09:55 compute-1 systemd[1]: Starting nova_compute container...
Nov 24 13:09:55 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:09:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f82ea6b9d2dbf689b70d349e1fd1ebfc3ff220401cded3244bd16192b6e679/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:55 compute-1 podman[187063]: 2025-11-24 13:09:55.401070144 +0000 UTC m=+0.101944901 container init da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, container_name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:09:55 compute-1 podman[187063]: 2025-11-24 13:09:55.411710207 +0000 UTC m=+0.112584954 container start da33b8f361c99e295f1ef1acd428c126080cf99d3058d728c21fcb39328991e3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 13:09:55 compute-1 podman[187063]: nova_compute
Nov 24 13:09:55 compute-1 nova_compute[187078]: + sudo -E kolla_set_configs
Nov 24 13:09:55 compute-1 systemd[1]: Started nova_compute container.
Nov 24 13:09:55 compute-1 sudo[187015]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Validating config file
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying service configuration files
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /etc/ceph
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Creating directory /etc/ceph
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /etc/ceph
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Writing out command to execute
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:55 compute-1 nova_compute[187078]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 13:09:55 compute-1 nova_compute[187078]: ++ cat /run_command
Nov 24 13:09:55 compute-1 nova_compute[187078]: + CMD=nova-compute
Nov 24 13:09:55 compute-1 nova_compute[187078]: + ARGS=
Nov 24 13:09:55 compute-1 nova_compute[187078]: + sudo kolla_copy_cacerts
Nov 24 13:09:55 compute-1 nova_compute[187078]: + [[ ! -n '' ]]
Nov 24 13:09:55 compute-1 nova_compute[187078]: + . kolla_extend_start
Nov 24 13:09:55 compute-1 nova_compute[187078]: Running command: 'nova-compute'
Nov 24 13:09:55 compute-1 nova_compute[187078]: + echo 'Running command: '\''nova-compute'\'''
Nov 24 13:09:55 compute-1 nova_compute[187078]: + umask 0022
Nov 24 13:09:55 compute-1 nova_compute[187078]: + exec nova-compute
Nov 24 13:09:56 compute-1 sudo[187239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsvlqukiofablcfcgtxfittlokkdisxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989795.7604432-2956-220545893723869/AnsiballZ_podman_container.py'
Nov 24 13:09:56 compute-1 sudo[187239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:09:56 compute-1 python3.9[187241]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 24 13:09:56 compute-1 systemd[1]: Started libpod-conmon-90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26.scope.
Nov 24 13:09:56 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:09:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f246400d741d64ffadbf5895063c3ad74d73ad70be1b1024c908104ba35bba2b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f246400d741d64ffadbf5895063c3ad74d73ad70be1b1024c908104ba35bba2b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f246400d741d64ffadbf5895063c3ad74d73ad70be1b1024c908104ba35bba2b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 24 13:09:56 compute-1 podman[187268]: 2025-11-24 13:09:56.636452899 +0000 UTC m=+0.176531426 container init 90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:09:56 compute-1 podman[187268]: 2025-11-24 13:09:56.645469858 +0000 UTC m=+0.185548355 container start 90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251118)
Nov 24 13:09:56 compute-1 python3.9[187241]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Applying nova statedir ownership
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 24 13:09:56 compute-1 nova_compute_init[187291]: INFO:nova_statedir:Nova statedir ownership complete
Nov 24 13:09:56 compute-1 systemd[1]: libpod-90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26.scope: Deactivated successfully.
Nov 24 13:09:56 compute-1 podman[187305]: 2025-11-24 13:09:56.773873546 +0000 UTC m=+0.030049959 container died 90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:09:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26-userdata-shm.mount: Deactivated successfully.
Nov 24 13:09:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-f246400d741d64ffadbf5895063c3ad74d73ad70be1b1024c908104ba35bba2b-merged.mount: Deactivated successfully.
Nov 24 13:09:56 compute-1 podman[187305]: 2025-11-24 13:09:56.817697634 +0000 UTC m=+0.073874007 container cleanup 90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 24 13:09:56 compute-1 systemd[1]: libpod-conmon-90af41758584de3c883458b871d182d35321f0b542212db88d3c2e6a5f375f26.scope: Deactivated successfully.
Nov 24 13:09:56 compute-1 sudo[187239]: pam_unix(sudo:session): session closed for user root
Nov 24 13:09:57 compute-1 sshd-session[158907]: Connection closed by 192.168.122.30 port 35734
Nov 24 13:09:57 compute-1 sshd-session[158904]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:09:57 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Nov 24 13:09:57 compute-1 systemd[1]: session-25.scope: Consumed 1min 56.969s CPU time.
Nov 24 13:09:57 compute-1 systemd-logind[815]: Session 25 logged out. Waiting for processes to exit.
Nov 24 13:09:57 compute-1 systemd-logind[815]: Removed session 25.
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.506 187082 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.507 187082 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.507 187082 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.507 187082 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.640 187082 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.654 187082 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:09:57 compute-1 nova_compute[187078]: 2025-11-24 13:09:57.654 187082 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.105 187082 INFO nova.virt.driver [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.232 187082 INFO nova.compute.provider_config [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.238 187082 DEBUG oslo_concurrency.lockutils [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.238 187082 DEBUG oslo_concurrency.lockutils [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.238 187082 DEBUG oslo_concurrency.lockutils [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.239 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.239 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.239 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.239 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.239 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.239 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.240 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.241 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.242 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.242 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.242 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.242 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.242 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.242 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.243 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.244 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.244 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.244 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.244 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.244 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.244 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.245 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.246 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.247 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.248 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.249 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.250 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.251 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.252 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.253 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.254 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.255 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.256 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.257 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.258 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.259 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.260 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.261 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.262 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.263 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.264 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.265 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.266 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.267 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.268 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.269 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.270 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.271 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.272 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.273 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.274 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.275 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.276 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.277 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.278 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.279 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.280 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.281 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.282 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.283 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.284 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.285 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.286 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.286 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.286 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.286 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.286 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.286 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.287 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.288 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.289 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.289 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.289 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.289 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.289 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.290 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.291 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.291 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.291 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.291 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.291 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.291 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.292 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.293 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.293 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.293 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.293 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.293 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.294 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.294 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.294 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.294 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.294 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.295 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.295 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.295 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.295 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.295 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.296 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.296 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.296 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.296 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.296 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.296 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.297 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.297 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.297 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.297 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.297 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.297 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.298 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.298 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.298 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.298 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.298 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.298 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.299 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.299 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.299 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.299 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.299 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.300 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.301 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.301 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.301 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.301 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.301 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.301 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.302 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.303 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.304 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.305 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.306 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.307 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.308 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.308 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.308 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.308 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.308 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.308 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.309 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.310 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.310 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.310 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.310 187082 WARNING oslo_config.cfg [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 24 13:09:58 compute-1 nova_compute[187078]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 24 13:09:58 compute-1 nova_compute[187078]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 24 13:09:58 compute-1 nova_compute[187078]: and ``live_migration_inbound_addr`` respectively.
Nov 24 13:09:58 compute-1 nova_compute[187078]: ).  Its value may be silently ignored in the future.
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.310 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.311 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.312 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.313 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.314 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.315 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.315 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.315 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.315 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.315 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.315 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.316 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.317 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.318 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.319 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.320 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.321 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.322 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.323 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.324 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.325 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.326 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.326 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.326 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.326 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.326 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.326 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.327 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.328 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.329 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.330 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.330 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.330 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.330 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.330 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.330 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.331 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.331 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.331 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.331 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.331 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.331 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.332 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.332 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.332 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.332 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.332 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.332 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.333 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.334 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.335 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.336 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.337 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.338 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.339 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.339 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.339 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.339 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.339 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.339 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.340 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.341 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.342 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.343 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.343 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.343 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.343 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.343 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.344 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.344 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.344 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.344 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.344 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.345 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.346 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.347 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.347 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.347 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.347 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.347 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.348 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.348 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.348 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.348 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.348 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.348 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.349 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.350 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.351 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.352 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.353 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.354 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.354 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.354 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.354 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.354 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.354 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.355 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.355 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.355 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.355 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.355 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.355 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.356 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.357 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.358 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.358 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.358 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.358 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.358 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.358 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.359 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.360 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.360 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.360 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.360 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.360 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.360 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.361 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.361 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.361 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.361 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.361 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.361 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.362 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.362 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.362 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.362 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.362 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.362 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.363 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.364 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.365 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.366 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.367 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.368 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.369 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.370 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.371 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.372 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.373 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.373 187082 DEBUG oslo_service.service [None req-ea9956d2-7063-4ffb-a7b4-70422e9dad77 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.373 187082 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.393 187082 INFO nova.virt.node [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Determined node identity ece8f004-1d5b-407f-a713-f9e87706b045 from /var/lib/nova/compute_id
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.394 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.394 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.395 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.395 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.413 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f29d485e2e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.417 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f29d485e2e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.418 187082 INFO nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Connection event '1' reason 'None'
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.425 187082 INFO nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Libvirt host capabilities <capabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]: 
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <host>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <uuid>44bc1d66-b47f-487f-828a-c6d054feec1c</uuid>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <arch>x86_64</arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model>EPYC-Rome-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <vendor>AMD</vendor>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <microcode version='16777317'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <signature family='23' model='49' stepping='0'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='x2apic'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='tsc-deadline'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='osxsave'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='hypervisor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='tsc_adjust'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='spec-ctrl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='stibp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='arch-capabilities'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='cmp_legacy'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='topoext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='virt-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='lbrv'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='tsc-scale'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='vmcb-clean'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='pause-filter'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='pfthreshold'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='svme-addr-chk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='rdctl-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='skip-l1dfl-vmentry'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='mds-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature name='pschange-mc-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <pages unit='KiB' size='4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <pages unit='KiB' size='2048'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <pages unit='KiB' size='1048576'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <power_management>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <suspend_mem/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <suspend_disk/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <suspend_hybrid/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </power_management>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <iommu support='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <migration_features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <live/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <uri_transports>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <uri_transport>tcp</uri_transport>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <uri_transport>rdma</uri_transport>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </uri_transports>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </migration_features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <topology>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <cells num='1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <cell id='0'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           <memory unit='KiB'>7864316</memory>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           <pages unit='KiB' size='2048'>0</pages>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           <distances>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <sibling id='0' value='10'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           </distances>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           <cpus num='8'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:           </cpus>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         </cell>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </cells>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </topology>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <cache>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </cache>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <secmodel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model>selinux</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <doi>0</doi>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </secmodel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <secmodel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model>dac</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <doi>0</doi>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </secmodel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </host>
Nov 24 13:09:58 compute-1 nova_compute[187078]: 
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <guest>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <os_type>hvm</os_type>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <arch name='i686'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <wordsize>32</wordsize>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <domain type='qemu'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <domain type='kvm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <pae/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <nonpae/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <acpi default='on' toggle='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <apic default='on' toggle='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <cpuselection/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <deviceboot/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <disksnapshot default='on' toggle='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <externalSnapshot/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </guest>
Nov 24 13:09:58 compute-1 nova_compute[187078]: 
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <guest>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <os_type>hvm</os_type>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <arch name='x86_64'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <wordsize>64</wordsize>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <domain type='qemu'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <domain type='kvm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <acpi default='on' toggle='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <apic default='on' toggle='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <cpuselection/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <deviceboot/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <disksnapshot default='on' toggle='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <externalSnapshot/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </guest>
Nov 24 13:09:58 compute-1 nova_compute[187078]: 
Nov 24 13:09:58 compute-1 nova_compute[187078]: </capabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]: 
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.434 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.437 187082 DEBUG nova.virt.libvirt.volume.mount [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.441 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 24 13:09:58 compute-1 nova_compute[187078]: <domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <domain>kvm</domain>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <arch>i686</arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <vcpu max='240'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <iothreads supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <os supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='firmware'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <loader supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>rom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pflash</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='readonly'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>yes</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='secure'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </loader>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </os>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='maximumMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <vendor>AMD</vendor>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='succor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='custom' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-128'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-256'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-512'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <memoryBacking supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='sourceType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>anonymous</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>memfd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </memoryBacking>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <disk supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='diskDevice'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>disk</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cdrom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>floppy</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>lun</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ide</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>fdc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>sata</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <graphics supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vnc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egl-headless</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </graphics>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <video supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='modelType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vga</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cirrus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>none</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>bochs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ramfb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </video>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hostdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='mode'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>subsystem</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='startupPolicy'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>mandatory</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>requisite</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>optional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='subsysType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pci</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='capsType'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='pciBackend'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hostdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <rng supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>random</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <filesystem supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='driverType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>path</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>handle</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtiofs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </filesystem>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <tpm supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-tis</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-crb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emulator</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>external</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendVersion'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>2.0</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </tpm>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <redirdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </redirdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <channel supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </channel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <crypto supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </crypto>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <interface supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>passt</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <panic supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>isa</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>hyperv</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </panic>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <console supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>null</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dev</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pipe</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stdio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>udp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tcp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu-vdagent</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </console>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <gic supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <genid supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backup supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <async-teardown supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <ps2 supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sev supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sgx supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hyperv supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='features'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>relaxed</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vapic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>spinlocks</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vpindex</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>runtime</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>synic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stimer</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reset</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vendor_id</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>frequencies</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reenlightenment</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tlbflush</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ipi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>avic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emsr_bitmap</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>xmm_input</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hyperv>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <launchSecurity supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='sectype'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tdx</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </launchSecurity>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </features>
Nov 24 13:09:58 compute-1 nova_compute[187078]: </domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.452 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 24 13:09:58 compute-1 nova_compute[187078]: <domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <domain>kvm</domain>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <arch>i686</arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <vcpu max='4096'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <iothreads supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <os supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='firmware'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <loader supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>rom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pflash</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='readonly'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>yes</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='secure'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </loader>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </os>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='maximumMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <vendor>AMD</vendor>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='succor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='custom' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-128'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-256'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-512'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <memoryBacking supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='sourceType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>anonymous</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>memfd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </memoryBacking>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <disk supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='diskDevice'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>disk</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cdrom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>floppy</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>lun</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>fdc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>sata</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <graphics supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vnc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egl-headless</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </graphics>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <video supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='modelType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vga</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cirrus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>none</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>bochs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ramfb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </video>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hostdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='mode'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>subsystem</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='startupPolicy'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>mandatory</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>requisite</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>optional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='subsysType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pci</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='capsType'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='pciBackend'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hostdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <rng supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>random</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <filesystem supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='driverType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>path</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>handle</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtiofs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </filesystem>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <tpm supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-tis</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-crb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emulator</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>external</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendVersion'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>2.0</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </tpm>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <redirdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </redirdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <channel supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </channel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <crypto supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </crypto>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <interface supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>passt</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <panic supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>isa</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>hyperv</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </panic>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <console supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>null</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dev</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pipe</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stdio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>udp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tcp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu-vdagent</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </console>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <gic supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <genid supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backup supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <async-teardown supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <ps2 supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sev supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sgx supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hyperv supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='features'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>relaxed</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vapic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>spinlocks</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vpindex</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>runtime</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>synic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stimer</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reset</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vendor_id</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>frequencies</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reenlightenment</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tlbflush</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ipi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>avic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emsr_bitmap</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>xmm_input</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hyperv>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <launchSecurity supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='sectype'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tdx</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </launchSecurity>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </features>
Nov 24 13:09:58 compute-1 nova_compute[187078]: </domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.478 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.482 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 24 13:09:58 compute-1 nova_compute[187078]: <domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <domain>kvm</domain>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <arch>x86_64</arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <vcpu max='240'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <iothreads supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <os supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='firmware'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <loader supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>rom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pflash</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='readonly'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>yes</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='secure'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </loader>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </os>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='maximumMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <vendor>AMD</vendor>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='succor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='custom' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-128'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-256'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-512'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <memoryBacking supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='sourceType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>anonymous</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>memfd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </memoryBacking>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <disk supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='diskDevice'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>disk</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cdrom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>floppy</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>lun</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ide</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>fdc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>sata</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <graphics supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vnc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egl-headless</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </graphics>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <video supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='modelType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vga</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cirrus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>none</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>bochs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ramfb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </video>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hostdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='mode'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>subsystem</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='startupPolicy'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>mandatory</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>requisite</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>optional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='subsysType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pci</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='capsType'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='pciBackend'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hostdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <rng supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>random</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <filesystem supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='driverType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>path</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>handle</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtiofs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </filesystem>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <tpm supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-tis</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-crb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emulator</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>external</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendVersion'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>2.0</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </tpm>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <redirdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </redirdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <channel supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </channel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <crypto supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </crypto>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <interface supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>passt</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <panic supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>isa</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>hyperv</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </panic>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <console supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>null</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dev</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pipe</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stdio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>udp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tcp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu-vdagent</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </console>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <gic supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <genid supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backup supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <async-teardown supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <ps2 supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sev supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sgx supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hyperv supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='features'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>relaxed</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vapic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>spinlocks</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vpindex</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>runtime</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>synic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stimer</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reset</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vendor_id</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>frequencies</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reenlightenment</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tlbflush</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ipi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>avic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emsr_bitmap</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>xmm_input</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hyperv>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <launchSecurity supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='sectype'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tdx</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </launchSecurity>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </features>
Nov 24 13:09:58 compute-1 nova_compute[187078]: </domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.583 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 24 13:09:58 compute-1 nova_compute[187078]: <domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <domain>kvm</domain>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <arch>x86_64</arch>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <vcpu max='4096'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <iothreads supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <os supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='firmware'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>efi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <loader supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>rom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pflash</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='readonly'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>yes</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='secure'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>yes</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>no</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </loader>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </os>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-passthrough' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='hostPassthroughMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='maximum' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='maximumMigratable'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>on</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>off</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='host-model' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <vendor>AMD</vendor>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='x2apic'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='hypervisor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='stibp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='overflow-recov'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='succor'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lbrv'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='tsc-scale'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='flushbyasid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pause-filter'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='pfthreshold'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <feature policy='disable' name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <mode name='custom' supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Broadwell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Cooperlake-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Denverton-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Dhyana-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='auto-ibrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Milan-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amd-psfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='no-nested-data-bp'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='null-sel-clr-base'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='stibp-always-on'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-Rome-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='EPYC-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='GraniteRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-128'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-256'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx10-512'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='prefetchiti'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Haswell-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v6'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Icelake-Server-v7'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='IvyBridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='KnightsMill-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4fmaps'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-4vnniw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512er'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512pf'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G4-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Opteron_G5-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fma4'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tbm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xop'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SapphireRapids-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='amx-tile'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-bf16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-fp16'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512-vpopcntdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bitalg'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vbmi2'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrc'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fzrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='la57'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='taa-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='tsx-ldtrk'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xfd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='SierraForest-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ifma'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-ne-convert'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx-vnni-int8'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='bus-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cmpccxadd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fbsdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='fsrs'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ibrs-all'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mcdt-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pbrsb-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='psdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='sbdr-ssdp-no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='serialize'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vaes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='vpclmulqdq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Client-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='hle'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='rtm'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Skylake-Server-v5'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512bw'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512cd'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512dq'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512f'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='avx512vl'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='invpcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pcid'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='pku'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='mpx'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v2'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v3'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='core-capability'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='split-lock-detect'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='Snowridge-v4'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='cldemote'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='erms'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='gfni'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdir64b'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='movdiri'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='xsaves'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='athlon-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='core2duo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='coreduo-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='n270-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='ss'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <blockers model='phenom-v1'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnow'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <feature name='3dnowext'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </blockers>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </mode>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <memoryBacking supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <enum name='sourceType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>anonymous</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <value>memfd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </memoryBacking>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <disk supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='diskDevice'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>disk</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cdrom</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>floppy</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>lun</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>fdc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>sata</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <graphics supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vnc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egl-headless</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </graphics>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <video supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='modelType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vga</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>cirrus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>none</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>bochs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ramfb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </video>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hostdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='mode'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>subsystem</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='startupPolicy'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>mandatory</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>requisite</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>optional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='subsysType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pci</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>scsi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='capsType'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='pciBackend'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hostdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <rng supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtio-non-transitional</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>random</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>egd</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <filesystem supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='driverType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>path</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>handle</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>virtiofs</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </filesystem>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <tpm supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-tis</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tpm-crb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emulator</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>external</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendVersion'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>2.0</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </tpm>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <redirdev supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='bus'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>usb</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </redirdev>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <channel supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </channel>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <crypto supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendModel'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>builtin</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </crypto>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <interface supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='backendType'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>default</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>passt</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <panic supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='model'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>isa</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>hyperv</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </panic>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <console supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='type'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>null</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vc</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pty</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dev</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>file</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>pipe</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stdio</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>udp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tcp</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>unix</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>qemu-vdagent</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>dbus</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </console>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <features>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <gic supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <vmcoreinfo supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <genid supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backingStoreInput supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <backup supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <async-teardown supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <ps2 supported='yes'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sev supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <sgx supported='no'/>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <hyperv supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='features'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>relaxed</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vapic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>spinlocks</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vpindex</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>runtime</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>synic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>stimer</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reset</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>vendor_id</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>frequencies</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>reenlightenment</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tlbflush</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>ipi</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>avic</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>emsr_bitmap</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>xmm_input</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <spinlocks>4095</spinlocks>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <stimer_direct>on</stimer_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </defaults>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </hyperv>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     <launchSecurity supported='yes'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       <enum name='sectype'>
Nov 24 13:09:58 compute-1 nova_compute[187078]:         <value>tdx</value>
Nov 24 13:09:58 compute-1 nova_compute[187078]:       </enum>
Nov 24 13:09:58 compute-1 nova_compute[187078]:     </launchSecurity>
Nov 24 13:09:58 compute-1 nova_compute[187078]:   </features>
Nov 24 13:09:58 compute-1 nova_compute[187078]: </domainCapabilities>
Nov 24 13:09:58 compute-1 nova_compute[187078]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.649 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.649 187082 INFO nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Secure Boot support detected
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.651 187082 INFO nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.660 187082 DEBUG nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] cpu compare xml: <cpu match="exact">
Nov 24 13:09:58 compute-1 nova_compute[187078]:   <model>Nehalem</model>
Nov 24 13:09:58 compute-1 nova_compute[187078]: </cpu>
Nov 24 13:09:58 compute-1 nova_compute[187078]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.662 187082 DEBUG nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.685 187082 INFO nova.virt.node [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Determined node identity ece8f004-1d5b-407f-a713-f9e87706b045 from /var/lib/nova/compute_id
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.705 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Verified node ece8f004-1d5b-407f-a713-f9e87706b045 matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.732 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.823 187082 DEBUG oslo_concurrency.lockutils [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.823 187082 DEBUG oslo_concurrency.lockutils [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.823 187082 DEBUG oslo_concurrency.lockutils [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.823 187082 DEBUG nova.compute.resource_tracker [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.962 187082 WARNING nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.963 187082 DEBUG nova.compute.resource_tracker [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6206MB free_disk=73.66716384887695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.963 187082 DEBUG oslo_concurrency.lockutils [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:09:58 compute-1 nova_compute[187078]: 2025-11-24 13:09:58.963 187082 DEBUG oslo_concurrency.lockutils [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.063 187082 DEBUG nova.compute.resource_tracker [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.063 187082 DEBUG nova.compute.resource_tracker [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.132 187082 DEBUG nova.scheduler.client.report [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.225 187082 DEBUG nova.scheduler.client.report [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.226 187082 DEBUG nova.compute.provider_tree [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.263 187082 DEBUG nova.scheduler.client.report [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.289 187082 DEBUG nova.scheduler.client.report [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.309 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 24 13:09:59 compute-1 nova_compute[187078]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.310 187082 INFO nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] kernel doesn't support AMD SEV
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.311 187082 DEBUG nova.compute.provider_tree [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.312 187082 DEBUG nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.316 187082 DEBUG nova.virt.libvirt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Libvirt baseline CPU <cpu>
Nov 24 13:09:59 compute-1 nova_compute[187078]:   <arch>x86_64</arch>
Nov 24 13:09:59 compute-1 nova_compute[187078]:   <model>Nehalem</model>
Nov 24 13:09:59 compute-1 nova_compute[187078]:   <vendor>AMD</vendor>
Nov 24 13:09:59 compute-1 nova_compute[187078]:   <topology sockets="8" cores="1" threads="1"/>
Nov 24 13:09:59 compute-1 nova_compute[187078]: </cpu>
Nov 24 13:09:59 compute-1 nova_compute[187078]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.339 187082 DEBUG nova.scheduler.client.report [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.363 187082 DEBUG nova.compute.resource_tracker [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.364 187082 DEBUG oslo_concurrency.lockutils [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.364 187082 DEBUG nova.service [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.422 187082 DEBUG nova.service [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 24 13:09:59 compute-1 nova_compute[187078]: 2025-11-24 13:09:59.422 187082 DEBUG nova.servicegroup.drivers.db [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 24 13:10:02 compute-1 sshd-session[187384]: Accepted publickey for zuul from 192.168.122.30 port 50190 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:10:02 compute-1 systemd-logind[815]: New session 27 of user zuul.
Nov 24 13:10:02 compute-1 systemd[1]: Started Session 27 of User zuul.
Nov 24 13:10:02 compute-1 sshd-session[187384]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:10:03 compute-1 python3.9[187537]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:10:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:10:04.136 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:10:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:10:04.137 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:10:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:10:04.137 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:10:04 compute-1 sshd-session[187566]: Invalid user ir from 85.209.134.43 port 45448
Nov 24 13:10:04 compute-1 sshd-session[187566]: Received disconnect from 85.209.134.43 port 45448:11: Bye Bye [preauth]
Nov 24 13:10:04 compute-1 sshd-session[187566]: Disconnected from invalid user ir 85.209.134.43 port 45448 [preauth]
Nov 24 13:10:05 compute-1 sudo[187693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpowkgfyvlejkmugiddpgvteezzzqqfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989804.4974668-53-48267453961037/AnsiballZ_systemd_service.py'
Nov 24 13:10:05 compute-1 sudo[187693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:05 compute-1 python3.9[187695]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:10:05 compute-1 systemd[1]: Reloading.
Nov 24 13:10:05 compute-1 systemd-rc-local-generator[187722]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:10:05 compute-1 systemd-sysv-generator[187725]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:10:05 compute-1 sudo[187693]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:06 compute-1 python3.9[187879]: ansible-ansible.builtin.service_facts Invoked
Nov 24 13:10:06 compute-1 network[187896]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 13:10:06 compute-1 network[187897]: 'network-scripts' will be removed from distribution in near future.
Nov 24 13:10:06 compute-1 network[187898]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 13:10:09 compute-1 nova_compute[187078]: 2025-11-24 13:10:09.425 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:09 compute-1 nova_compute[187078]: 2025-11-24 13:10:09.441 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:09 compute-1 podman[188014]: 2025-11-24 13:10:09.701591248 +0000 UTC m=+0.074791922 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 24 13:10:10 compute-1 sshd-session[187383]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:10:10 compute-1 sshd-session[187383]: banner exchange: Connection from 218.56.160.82 port 18866: Connection timed out
Nov 24 13:10:11 compute-1 sudo[188188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swwofeqnwvxaisiqcnomenofgnrsckdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989811.2230632-91-71854238810804/AnsiballZ_systemd_service.py'
Nov 24 13:10:11 compute-1 sudo[188188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:11 compute-1 python3.9[188190]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:10:11 compute-1 sudo[188188]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:12 compute-1 sudo[188341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mptdjasjmtuxvqbsvzczelabtssvrmse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989812.3183095-111-200824397909286/AnsiballZ_file.py'
Nov 24 13:10:12 compute-1 sudo[188341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:12 compute-1 sshd-session[188136]: Invalid user redmine from 175.100.24.139 port 34358
Nov 24 13:10:12 compute-1 python3.9[188343]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:12 compute-1 sudo[188341]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:12 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 13:10:13 compute-1 sshd-session[188136]: Received disconnect from 175.100.24.139 port 34358:11: Bye Bye [preauth]
Nov 24 13:10:13 compute-1 sshd-session[188136]: Disconnected from invalid user redmine 175.100.24.139 port 34358 [preauth]
Nov 24 13:10:13 compute-1 sudo[188524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edlekaseaylopppzcljmptmhyrniudzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989813.189871-127-33184777869298/AnsiballZ_file.py'
Nov 24 13:10:13 compute-1 sudo[188524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:13 compute-1 podman[188468]: 2025-11-24 13:10:13.471677756 +0000 UTC m=+0.059917472 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:10:13 compute-1 podman[188469]: 2025-11-24 13:10:13.519177055 +0000 UTC m=+0.095460382 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:10:13 compute-1 python3.9[188532]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:13 compute-1 sudo[188524]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:14 compute-1 sudo[188692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gknepoubkvnncymibhsineygxngjqiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989813.9619317-145-52732786478198/AnsiballZ_command.py'
Nov 24 13:10:14 compute-1 sudo[188692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:14 compute-1 python3.9[188694]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:10:14 compute-1 sudo[188692]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:15 compute-1 python3.9[188846]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 13:10:16 compute-1 sudo[188996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unumlwlrubtfqrqbczvvynsvgawhqvky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989815.8182013-181-258316210232058/AnsiballZ_systemd_service.py'
Nov 24 13:10:16 compute-1 sudo[188996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:16 compute-1 python3.9[188998]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:10:16 compute-1 systemd[1]: Reloading.
Nov 24 13:10:16 compute-1 systemd-rc-local-generator[189023]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:10:16 compute-1 systemd-sysv-generator[189028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:10:16 compute-1 sudo[188996]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:17 compute-1 sudo[189183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtezcbqqgofxxjezqkwlybnakzcihuyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989817.027544-197-247679698234479/AnsiballZ_command.py'
Nov 24 13:10:17 compute-1 sudo[189183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:17 compute-1 python3.9[189185]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:10:17 compute-1 sudo[189183]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:18 compute-1 sudo[189336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztarikwiidhslbxuezgslczlpmdtwlla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989817.850506-215-151490722683161/AnsiballZ_file.py'
Nov 24 13:10:18 compute-1 sudo[189336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:18 compute-1 python3.9[189338]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:10:18 compute-1 sudo[189336]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:19 compute-1 python3.9[189488]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:10:20 compute-1 python3.9[189640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:20 compute-1 python3.9[189762]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989819.657672-247-57941693189871/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:10:21 compute-1 sudo[189912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmnbyrpmgthtvtshcrufjaemrqkuioyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989821.1906612-277-37222530731395/AnsiballZ_group.py'
Nov 24 13:10:21 compute-1 sudo[189912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:21 compute-1 python3.9[189914]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 24 13:10:21 compute-1 sudo[189912]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:22 compute-1 sudo[190064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkauamkbocvomczuorlkkmycusiwhkcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989822.396717-299-149386747230009/AnsiballZ_getent.py'
Nov 24 13:10:22 compute-1 sudo[190064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:23 compute-1 python3.9[190066]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 24 13:10:23 compute-1 sudo[190064]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:23 compute-1 sudo[190217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loneiksagtoqaqttfycndpouaazuwywt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989823.4704895-315-233134223403589/AnsiballZ_group.py'
Nov 24 13:10:23 compute-1 sudo[190217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:23 compute-1 python3.9[190219]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 13:10:23 compute-1 groupadd[190220]: group added to /etc/group: name=ceilometer, GID=42405
Nov 24 13:10:23 compute-1 groupadd[190220]: group added to /etc/gshadow: name=ceilometer
Nov 24 13:10:23 compute-1 groupadd[190220]: new group: name=ceilometer, GID=42405
Nov 24 13:10:24 compute-1 sudo[190217]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:24 compute-1 sudo[190375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sutxhpkrplxfupzmadcecqrcqitxguuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989824.2666442-331-41007648226122/AnsiballZ_user.py'
Nov 24 13:10:24 compute-1 sudo[190375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:25 compute-1 python3.9[190377]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 13:10:25 compute-1 useradd[190379]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 13:10:25 compute-1 useradd[190379]: add 'ceilometer' to group 'libvirt'
Nov 24 13:10:25 compute-1 useradd[190379]: add 'ceilometer' to shadow group 'libvirt'
Nov 24 13:10:25 compute-1 sudo[190375]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:26 compute-1 python3.9[190535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:26 compute-1 python3.9[190656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763989826.0165706-383-91508954044871/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:27 compute-1 python3.9[190806]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:28 compute-1 python3.9[190927]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763989827.184183-383-109880097162340/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:28 compute-1 python3.9[191077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:29 compute-1 python3.9[191198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763989828.4306774-383-111504494431524/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:30 compute-1 python3.9[191348]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:10:30 compute-1 sshd-session[189688]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:10:30 compute-1 sshd-session[189688]: banner exchange: Connection from 218.56.160.82 port 41620: Connection timed out
Nov 24 13:10:30 compute-1 python3.9[191500]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:10:31 compute-1 python3.9[191654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:31 compute-1 sshd-session[191501]: Invalid user testadmin from 5.198.176.28 port 42758
Nov 24 13:10:31 compute-1 sshd-session[191501]: Received disconnect from 5.198.176.28 port 42758:11: Bye Bye [preauth]
Nov 24 13:10:31 compute-1 sshd-session[191501]: Disconnected from invalid user testadmin 5.198.176.28 port 42758 [preauth]
Nov 24 13:10:32 compute-1 python3.9[191775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989831.166443-501-111979541021390/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:33 compute-1 python3.9[191925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:33 compute-1 python3.9[192001]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:34 compute-1 python3.9[192151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:35 compute-1 python3.9[192272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989834.0595324-501-178393402035068/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:35 compute-1 python3.9[192422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:36 compute-1 python3.9[192543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989835.416088-501-264402344011744/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:37 compute-1 python3.9[192693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:37 compute-1 python3.9[192814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989836.6818516-501-19410675464102/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:38 compute-1 python3.9[192965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:38 compute-1 python3.9[193086]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989837.9341106-501-179415754138123/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:39 compute-1 python3.9[193236]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:39 compute-1 podman[193331]: 2025-11-24 13:10:39.907919089 +0000 UTC m=+0.049661316 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 13:10:40 compute-1 python3.9[193369]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989839.0808666-501-153395920912182/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:40 compute-1 python3.9[193525]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:41 compute-1 python3.9[193646]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989840.2496052-501-12029559483160/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:41 compute-1 python3.9[193796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:42 compute-1 python3.9[193917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989841.4618204-501-64515393885721/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:43 compute-1 python3.9[194067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:43 compute-1 podman[194162]: 2025-11-24 13:10:43.562237783 +0000 UTC m=+0.053992352 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 13:10:43 compute-1 podman[194209]: 2025-11-24 13:10:43.707275424 +0000 UTC m=+0.108884707 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 13:10:43 compute-1 python3.9[194205]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989842.6670792-501-132152123196048/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:44 compute-1 python3.9[194384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:45 compute-1 python3.9[194505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989843.9020357-501-28226009392601/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:46 compute-1 python3.9[194655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:47 compute-1 python3.9[194731]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:47 compute-1 python3.9[194881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:48 compute-1 sshd-session[192839]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:10:48 compute-1 sshd-session[192839]: banner exchange: Connection from 218.56.160.82 port 16665: Connection timed out
Nov 24 13:10:48 compute-1 python3.9[194957]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:49 compute-1 python3.9[195109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:49 compute-1 sshd-session[195034]: Invalid user redmine from 176.114.89.34 port 44292
Nov 24 13:10:50 compute-1 sshd-session[195034]: Received disconnect from 176.114.89.34 port 44292:11: Bye Bye [preauth]
Nov 24 13:10:50 compute-1 sshd-session[195034]: Disconnected from invalid user redmine 176.114.89.34 port 44292 [preauth]
Nov 24 13:10:50 compute-1 python3.9[195185]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:50 compute-1 sudo[195335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljqibftzbaqgpvcieifmorhxnqbrsqxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989850.345987-879-190152609576771/AnsiballZ_file.py'
Nov 24 13:10:50 compute-1 sudo[195335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:50 compute-1 python3.9[195337]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:50 compute-1 sudo[195335]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:51 compute-1 sshd-session[195338]: Invalid user sol from 193.32.162.145 port 43772
Nov 24 13:10:51 compute-1 sshd-session[195338]: Connection closed by invalid user sol 193.32.162.145 port 43772 [preauth]
Nov 24 13:10:51 compute-1 sudo[195489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxtdhoufgjuveveenorhoauojxtqxbrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989851.2096956-895-226790786620978/AnsiballZ_file.py'
Nov 24 13:10:51 compute-1 sudo[195489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:51 compute-1 python3.9[195491]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:10:51 compute-1 sudo[195489]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:52 compute-1 sudo[195641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zagmbwmpjrvzsdevljbukjkfxejnarxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989852.1065114-911-231622589154513/AnsiballZ_file.py'
Nov 24 13:10:52 compute-1 sudo[195641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:52 compute-1 python3.9[195643]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:10:52 compute-1 sudo[195641]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:53 compute-1 sudo[195793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eydlryevlburgslcscntxbyckfszriir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989852.8901482-927-46072777121144/AnsiballZ_systemd_service.py'
Nov 24 13:10:53 compute-1 sudo[195793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:53 compute-1 python3.9[195795]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:10:53 compute-1 systemd[1]: Reloading.
Nov 24 13:10:53 compute-1 systemd-rc-local-generator[195824]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:10:53 compute-1 systemd-sysv-generator[195828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:10:54 compute-1 systemd[1]: Listening on Podman API Socket.
Nov 24 13:10:54 compute-1 sudo[195793]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:54 compute-1 sudo[195984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fndfvbmhsasvwqdfhtzijgixzqawmfqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989854.3854547-945-146744654395455/AnsiballZ_stat.py'
Nov 24 13:10:54 compute-1 sudo[195984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:54 compute-1 python3.9[195986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:10:54 compute-1 sudo[195984]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:55 compute-1 sshd-session[196028]: Invalid user sol from 45.148.10.240 port 55842
Nov 24 13:10:55 compute-1 sudo[196110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkooniejwgaofhgxqchczixmszljtpwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989854.3854547-945-146744654395455/AnsiballZ_copy.py'
Nov 24 13:10:55 compute-1 sudo[196110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:55 compute-1 sshd-session[196028]: Connection closed by invalid user sol 45.148.10.240 port 55842 [preauth]
Nov 24 13:10:55 compute-1 python3.9[196112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989854.3854547-945-146744654395455/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:10:55 compute-1 sudo[196110]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:56 compute-1 sudo[196262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ditvrzvbzfyzmfdwpssnnzycwsqdmjtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989856.1743011-979-110840948819551/AnsiballZ_container_config_data.py'
Nov 24 13:10:56 compute-1 sudo[196262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:56 compute-1 python3.9[196264]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 24 13:10:56 compute-1 sudo[196262]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.670 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.670 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.670 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:10:57 compute-1 sudo[196414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djwlagyrmjmwdrfaznirbsezwxjhlhyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989857.241036-997-56588846451904/AnsiballZ_container_config_hash.py'
Nov 24 13:10:57 compute-1 sudo[196414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.681 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.681 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.682 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.682 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.682 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.683 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.683 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.683 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.683 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.701 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.702 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.702 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.702 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.894 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.897 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6157MB free_disk=73.66701126098633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.897 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.898 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:10:57 compute-1 python3.9[196416]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:10:57 compute-1 sudo[196414]: pam_unix(sudo:session): session closed for user root
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.967 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.967 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:10:57 compute-1 nova_compute[187078]: 2025-11-24 13:10:57.996 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:10:58 compute-1 nova_compute[187078]: 2025-11-24 13:10:58.010 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:10:58 compute-1 nova_compute[187078]: 2025-11-24 13:10:58.012 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:10:58 compute-1 nova_compute[187078]: 2025-11-24 13:10:58.012 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:10:58 compute-1 sudo[196566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyxfjvvmbehlpjbuyzfeyrwdmyvarspb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989858.3098745-1017-202447860535835/AnsiballZ_edpm_container_manage.py'
Nov 24 13:10:58 compute-1 sudo[196566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:10:59 compute-1 python3[196568]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:11:00 compute-1 podman[196580]: 2025-11-24 13:11:00.397875574 +0000 UTC m=+1.262425480 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 24 13:11:00 compute-1 podman[196676]: 2025-11-24 13:11:00.566001043 +0000 UTC m=+0.063189617 container create 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:11:00 compute-1 podman[196676]: 2025-11-24 13:11:00.531665491 +0000 UTC m=+0.028854085 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 24 13:11:00 compute-1 python3[196568]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 24 13:11:00 compute-1 sudo[196566]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:01 compute-1 sudo[196865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edefcaxbuyhvdoierxkinrwihuxyeyed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989861.1625488-1033-232557718970968/AnsiballZ_stat.py'
Nov 24 13:11:01 compute-1 sudo[196865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:01 compute-1 python3.9[196867]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:11:01 compute-1 sudo[196865]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:01 compute-1 sshd-session[196894]: Invalid user mike from 85.209.134.43 port 49578
Nov 24 13:11:02 compute-1 sshd-session[196894]: Received disconnect from 85.209.134.43 port 49578:11: Bye Bye [preauth]
Nov 24 13:11:02 compute-1 sshd-session[196894]: Disconnected from invalid user mike 85.209.134.43 port 49578 [preauth]
Nov 24 13:11:02 compute-1 sudo[197021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujqwvdsalfxvkpaqeenrhrxnrxaopqpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989862.0031924-1051-73109143980238/AnsiballZ_file.py'
Nov 24 13:11:02 compute-1 sudo[197021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:02 compute-1 python3.9[197023]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:02 compute-1 sshd[128576]: Timeout before authentication for connection from 45.78.217.131 to 38.102.83.173, pid = 178410
Nov 24 13:11:02 compute-1 sudo[197021]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:03 compute-1 sudo[197174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hworrghsttojkmacxurdszynryqdvhnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989862.6313612-1051-34417505619664/AnsiballZ_copy.py'
Nov 24 13:11:03 compute-1 sudo[197174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:03 compute-1 python3.9[197176]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763989862.6313612-1051-34417505619664/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:03 compute-1 sudo[197174]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:03 compute-1 sudo[197250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odssnergmnoecdygczanqzrczdcvggun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989862.6313612-1051-34417505619664/AnsiballZ_systemd.py'
Nov 24 13:11:03 compute-1 sudo[197250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:04 compute-1 python3.9[197252]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:11:04 compute-1 systemd[1]: Reloading.
Nov 24 13:11:04 compute-1 systemd-sysv-generator[197278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:11:04 compute-1 systemd-rc-local-generator[197274]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:11:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:11:04.136 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:11:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:11:04.137 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:11:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:11:04.137 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:11:04 compute-1 sshd-session[197122]: Invalid user chris from 68.183.82.237 port 48414
Nov 24 13:11:04 compute-1 sudo[197250]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:04 compute-1 sshd-session[197122]: Received disconnect from 68.183.82.237 port 48414:11: Bye Bye [preauth]
Nov 24 13:11:04 compute-1 sshd-session[197122]: Disconnected from invalid user chris 68.183.82.237 port 48414 [preauth]
Nov 24 13:11:04 compute-1 sudo[197361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdmkhqaslvgygyexusnrviakpcdcxicv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989862.6313612-1051-34417505619664/AnsiballZ_systemd.py'
Nov 24 13:11:04 compute-1 sudo[197361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:04 compute-1 python3.9[197363]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:11:04 compute-1 systemd[1]: Reloading.
Nov 24 13:11:05 compute-1 systemd-sysv-generator[197395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:11:05 compute-1 systemd-rc-local-generator[197389]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:11:05 compute-1 systemd[1]: Starting podman_exporter container...
Nov 24 13:11:05 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:11:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19eed9efa6aa440c60046be6101aaa0188c82f5cdb5aa400ff06364b2d8bd976/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19eed9efa6aa440c60046be6101aaa0188c82f5cdb5aa400ff06364b2d8bd976/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:05 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.
Nov 24 13:11:05 compute-1 podman[197403]: 2025-11-24 13:11:05.50849977 +0000 UTC m=+0.141083889 container init 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.525Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.525Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.525Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.525Z caller=handler.go:105 level=info collector=container
Nov 24 13:11:05 compute-1 podman[197403]: 2025-11-24 13:11:05.540647909 +0000 UTC m=+0.173231988 container start 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:05 compute-1 systemd[1]: Starting Podman API Service...
Nov 24 13:11:05 compute-1 systemd[1]: Started Podman API Service.
Nov 24 13:11:05 compute-1 podman[197403]: podman_exporter
Nov 24 13:11:05 compute-1 systemd[1]: Started podman_exporter container.
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="Setting parallel job count to 25"
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="Using sqlite as database backend"
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 24 13:11:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:11:05 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 24 13:11:05 compute-1 podman[197429]: time="2025-11-24T13:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:11:05 compute-1 sudo[197361]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:05 compute-1 podman[197427]: 2025-11-24 13:11:05.601266022 +0000 UTC m=+0.053185039 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:11:05 compute-1 systemd[1]: 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b-f13253d98e97b0b.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 13:11:05 compute-1 systemd[1]: 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b-f13253d98e97b0b.service: Failed with result 'exit-code'.
Nov 24 13:11:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14064 "" "Go-http-client/1.1"
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.615Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.616Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 24 13:11:05 compute-1 podman_exporter[197418]: ts=2025-11-24T13:11:05.616Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 24 13:11:05 compute-1 sshd-session[196060]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:11:05 compute-1 sshd-session[196060]: banner exchange: Connection from 218.56.160.82 port 15461: Connection timed out
Nov 24 13:11:06 compute-1 sudo[197611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duykjwrlupykhrcaqfoploptxwkdyzmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989866.1772597-1099-242506397030997/AnsiballZ_systemd.py'
Nov 24 13:11:06 compute-1 sudo[197611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:06 compute-1 python3.9[197613]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:11:06 compute-1 systemd[1]: Stopping podman_exporter container...
Nov 24 13:11:06 compute-1 podman[197429]: @ - - [24/Nov/2025:13:11:05 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 24 13:11:06 compute-1 systemd[1]: libpod-72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.scope: Deactivated successfully.
Nov 24 13:11:06 compute-1 podman[197617]: 2025-11-24 13:11:06.892521144 +0000 UTC m=+0.066654848 container died 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:06 compute-1 systemd[1]: 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b-f13253d98e97b0b.timer: Deactivated successfully.
Nov 24 13:11:06 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.
Nov 24 13:11:06 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b-userdata-shm.mount: Deactivated successfully.
Nov 24 13:11:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-19eed9efa6aa440c60046be6101aaa0188c82f5cdb5aa400ff06364b2d8bd976-merged.mount: Deactivated successfully.
Nov 24 13:11:07 compute-1 podman[197617]: 2025-11-24 13:11:07.126095575 +0000 UTC m=+0.300229269 container cleanup 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:07 compute-1 podman[197617]: podman_exporter
Nov 24 13:11:07 compute-1 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 13:11:07 compute-1 podman[197646]: podman_exporter
Nov 24 13:11:07 compute-1 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 24 13:11:07 compute-1 systemd[1]: Stopped podman_exporter container.
Nov 24 13:11:07 compute-1 systemd[1]: Starting podman_exporter container...
Nov 24 13:11:07 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:11:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19eed9efa6aa440c60046be6101aaa0188c82f5cdb5aa400ff06364b2d8bd976/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19eed9efa6aa440c60046be6101aaa0188c82f5cdb5aa400ff06364b2d8bd976/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:07 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.
Nov 24 13:11:07 compute-1 podman[197659]: 2025-11-24 13:11:07.331919824 +0000 UTC m=+0.120357420 container init 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.346Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.346Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.346Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.346Z caller=handler.go:105 level=info collector=container
Nov 24 13:11:07 compute-1 podman[197429]: @ - - [24/Nov/2025:13:11:07 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 24 13:11:07 compute-1 podman[197429]: time="2025-11-24T13:11:07Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:11:07 compute-1 podman[197659]: 2025-11-24 13:11:07.360031647 +0000 UTC m=+0.148469213 container start 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:11:07 compute-1 podman[197659]: podman_exporter
Nov 24 13:11:07 compute-1 podman[197429]: @ - - [24/Nov/2025:13:11:07 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14066 "" "Go-http-client/1.1"
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.367Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.367Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 24 13:11:07 compute-1 podman_exporter[197674]: ts=2025-11-24T13:11:07.368Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 24 13:11:07 compute-1 systemd[1]: Started podman_exporter container.
Nov 24 13:11:07 compute-1 sudo[197611]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:07 compute-1 podman[197683]: 2025-11-24 13:11:07.420796152 +0000 UTC m=+0.050077338 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:08 compute-1 sudo[197857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnzwukscikacdwkmsrwwipyqjwgzklro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989868.5909874-1115-270653581293771/AnsiballZ_stat.py'
Nov 24 13:11:08 compute-1 sudo[197857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:09 compute-1 python3.9[197859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:11:09 compute-1 sudo[197857]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:09 compute-1 sudo[197980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsfetlkmsinbujyfnecmmvbdazbcqusl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989868.5909874-1115-270653581293771/AnsiballZ_copy.py'
Nov 24 13:11:09 compute-1 sudo[197980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:09 compute-1 python3.9[197982]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763989868.5909874-1115-270653581293771/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 13:11:09 compute-1 sudo[197980]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:09 compute-1 auditd[705]: Audit daemon rotating log files
Nov 24 13:11:10 compute-1 sudo[198149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkujttappmcevacknvilmwkjbzghdxgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989870.2014053-1149-173883847802341/AnsiballZ_container_config_data.py'
Nov 24 13:11:10 compute-1 podman[198106]: 2025-11-24 13:11:10.488235912 +0000 UTC m=+0.051088857 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 13:11:10 compute-1 sudo[198149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:10 compute-1 python3.9[198153]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 24 13:11:10 compute-1 sudo[198149]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:11 compute-1 sudo[198307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qstpfkmwlbforilujvxgdqaqplvkumef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989870.9856453-1167-120670009569766/AnsiballZ_container_config_hash.py'
Nov 24 13:11:11 compute-1 sudo[198307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:11 compute-1 python3.9[198309]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 13:11:11 compute-1 sudo[198307]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:12 compute-1 sudo[198459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwpqshecdoodynsmvlvwfszzixfjwkpn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989871.8944378-1187-173292429522876/AnsiballZ_edpm_container_manage.py'
Nov 24 13:11:12 compute-1 sudo[198459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:12 compute-1 python3[198461]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 13:11:14 compute-1 podman[198516]: 2025-11-24 13:11:14.417518083 +0000 UTC m=+0.709396655 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:11:14 compute-1 podman[198473]: 2025-11-24 13:11:14.669416434 +0000 UTC m=+2.167677855 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 24 13:11:14 compute-1 podman[198549]: 2025-11-24 13:11:14.708750601 +0000 UTC m=+0.268138352 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 13:11:14 compute-1 podman[198613]: 2025-11-24 13:11:14.827032919 +0000 UTC m=+0.046783704 container create 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:11:14 compute-1 podman[198613]: 2025-11-24 13:11:14.798920806 +0000 UTC m=+0.018671591 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 24 13:11:14 compute-1 python3[198461]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 24 13:11:14 compute-1 sudo[198459]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:15 compute-1 sshd-session[198155]: Invalid user frontend from 45.78.194.40 port 50322
Nov 24 13:11:15 compute-1 sudo[198802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hadzfehzyioqlysvdoekhqcsdcikyofh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989875.4315739-1203-142041595073316/AnsiballZ_stat.py'
Nov 24 13:11:15 compute-1 sudo[198802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:15 compute-1 python3.9[198804]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:11:15 compute-1 sudo[198802]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:16 compute-1 sudo[198956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucyuiwxgrnwaunensytudnbivtqatacn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989876.2112854-1221-277692540176619/AnsiballZ_file.py'
Nov 24 13:11:16 compute-1 sudo[198956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:16 compute-1 python3.9[198958]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:16 compute-1 sudo[198956]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:17 compute-1 sudo[199107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyvdeexgdlzyqqazmqzuhbvehfiblmmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989876.77699-1221-151559055027036/AnsiballZ_copy.py'
Nov 24 13:11:17 compute-1 sudo[199107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:17 compute-1 python3.9[199109]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763989876.77699-1221-151559055027036/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:17 compute-1 sudo[199107]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:17 compute-1 sudo[199183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvkazwxsbohrpgpnnywnynzcnrqxaugf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989876.77699-1221-151559055027036/AnsiballZ_systemd.py'
Nov 24 13:11:17 compute-1 sudo[199183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:17 compute-1 python3.9[199185]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:11:17 compute-1 systemd[1]: Reloading.
Nov 24 13:11:18 compute-1 systemd-rc-local-generator[199208]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:11:18 compute-1 systemd-sysv-generator[199212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:11:18 compute-1 sudo[199183]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:18 compute-1 sudo[199295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eahhrfswwlptrcsrtyvbeudftzzuubuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989876.77699-1221-151559055027036/AnsiballZ_systemd.py'
Nov 24 13:11:18 compute-1 sudo[199295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:18 compute-1 python3.9[199297]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 13:11:18 compute-1 systemd[1]: Reloading.
Nov 24 13:11:19 compute-1 systemd-rc-local-generator[199328]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:11:19 compute-1 systemd-sysv-generator[199332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 13:11:19 compute-1 systemd[1]: Starting openstack_network_exporter container...
Nov 24 13:11:19 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:11:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:19 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.
Nov 24 13:11:19 compute-1 podman[199337]: 2025-11-24 13:11:19.453889462 +0000 UTC m=+0.150694837 container init 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *bridge.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *coverage.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *datapath.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *iface.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *memory.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *ovnnorthd.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *ovn.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *ovsdbserver.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *pmd_perf.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *pmd_rxq.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: INFO    13:11:19 main.go:48: registering *vswitch.Collector
Nov 24 13:11:19 compute-1 openstack_network_exporter[199352]: NOTICE  13:11:19 main.go:76: listening on https://:9105/metrics
Nov 24 13:11:19 compute-1 podman[199337]: 2025-11-24 13:11:19.486389341 +0000 UTC m=+0.183194706 container start 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:11:19 compute-1 podman[199337]: openstack_network_exporter
Nov 24 13:11:19 compute-1 systemd[1]: Started openstack_network_exporter container.
Nov 24 13:11:19 compute-1 sudo[199295]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:19 compute-1 podman[199362]: 2025-11-24 13:11:19.577624308 +0000 UTC m=+0.081679322 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9)
Nov 24 13:11:20 compute-1 sudo[199534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cearihudxdbqmgzsracjnzgvelwjgoci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989880.5388653-1269-148495954430632/AnsiballZ_systemd.py'
Nov 24 13:11:20 compute-1 sudo[199534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:21 compute-1 python3.9[199536]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:11:21 compute-1 systemd[1]: Stopping openstack_network_exporter container...
Nov 24 13:11:21 compute-1 systemd[1]: libpod-952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.scope: Deactivated successfully.
Nov 24 13:11:21 compute-1 podman[199540]: 2025-11-24 13:11:21.244915879 +0000 UTC m=+0.079162249 container died 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350)
Nov 24 13:11:21 compute-1 systemd[1]: 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b-67af74c4428e8f8d.timer: Deactivated successfully.
Nov 24 13:11:21 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.
Nov 24 13:11:21 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b-userdata-shm.mount: Deactivated successfully.
Nov 24 13:11:21 compute-1 systemd[1]: var-lib-containers-storage-overlay-a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1-merged.mount: Deactivated successfully.
Nov 24 13:11:21 compute-1 sshd-session[198297]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:11:21 compute-1 sshd-session[198297]: banner exchange: Connection from 218.56.160.82 port 45002: Connection timed out
Nov 24 13:11:21 compute-1 podman[199540]: 2025-11-24 13:11:21.906037338 +0000 UTC m=+0.740283738 container cleanup 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:11:21 compute-1 podman[199540]: openstack_network_exporter
Nov 24 13:11:21 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 13:11:21 compute-1 podman[199569]: openstack_network_exporter
Nov 24 13:11:21 compute-1 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 24 13:11:21 compute-1 systemd[1]: Stopped openstack_network_exporter container.
Nov 24 13:11:22 compute-1 systemd[1]: Starting openstack_network_exporter container...
Nov 24 13:11:22 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:11:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93d231694f426c4257a7cd44da4062a0eb2e86eda099414aea7fdcf6d58e5e1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 13:11:22 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.
Nov 24 13:11:22 compute-1 podman[199582]: 2025-11-24 13:11:22.140460784 +0000 UTC m=+0.124064717 container init 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *bridge.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *coverage.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *datapath.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *iface.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *memory.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *ovnnorthd.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *ovn.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *ovsdbserver.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *pmd_perf.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *pmd_rxq.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: INFO    13:11:22 main.go:48: registering *vswitch.Collector
Nov 24 13:11:22 compute-1 openstack_network_exporter[199599]: NOTICE  13:11:22 main.go:76: listening on https://:9105/metrics
Nov 24 13:11:22 compute-1 podman[199582]: 2025-11-24 13:11:22.168255888 +0000 UTC m=+0.151859781 container start 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal)
Nov 24 13:11:22 compute-1 podman[199582]: openstack_network_exporter
Nov 24 13:11:22 compute-1 systemd[1]: Started openstack_network_exporter container.
Nov 24 13:11:22 compute-1 sudo[199534]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:22 compute-1 podman[199609]: 2025-11-24 13:11:22.249725332 +0000 UTC m=+0.068953304 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Nov 24 13:11:24 compute-1 sshd-session[198155]: Received disconnect from 45.78.194.40 port 50322:11: Bye Bye [preauth]
Nov 24 13:11:24 compute-1 sshd-session[198155]: Disconnected from invalid user frontend 45.78.194.40 port 50322 [preauth]
Nov 24 13:11:25 compute-1 sudo[199779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjmlvtoqulvgkxpbyjjatnwjefgeretc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989885.150058-1286-189187034949946/AnsiballZ_find.py'
Nov 24 13:11:25 compute-1 sudo[199779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:25 compute-1 python3.9[199781]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 13:11:25 compute-1 sudo[199779]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:26 compute-1 sshd[128576]: drop connection #0 from [45.78.217.131]:43784 on [38.102.83.173]:22 penalty: exceeded LoginGraceTime
Nov 24 13:11:26 compute-1 sudo[199931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zenlqlhhlkxktjfxpnvfjmavchqxvlom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989886.0852525-1303-93932219712039/AnsiballZ_podman_container_info.py'
Nov 24 13:11:26 compute-1 sudo[199931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:26 compute-1 python3.9[199933]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 24 13:11:26 compute-1 sudo[199931]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:27 compute-1 sudo[200097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psbgcnvqotandepktfqhkqnxkxjxlgxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989887.1303318-1311-130714838042974/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:27 compute-1 sudo[200097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:27 compute-1 python3.9[200099]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:27 compute-1 systemd[1]: Started libpod-conmon-e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321.scope.
Nov 24 13:11:27 compute-1 podman[200100]: 2025-11-24 13:11:27.940713563 +0000 UTC m=+0.135319492 container exec e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 24 13:11:28 compute-1 podman[200119]: 2025-11-24 13:11:28.01704142 +0000 UTC m=+0.057475903 container exec_died e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 13:11:28 compute-1 podman[200100]: 2025-11-24 13:11:28.032770354 +0000 UTC m=+0.227376263 container exec_died e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 13:11:28 compute-1 systemd[1]: libpod-conmon-e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321.scope: Deactivated successfully.
Nov 24 13:11:28 compute-1 sudo[200097]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:28 compute-1 sudo[200281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkczirnbknrivftkbbmvjgeqriihdkfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989888.2718368-1319-269759191597197/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:28 compute-1 sudo[200281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:28 compute-1 python3.9[200283]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:28 compute-1 systemd[1]: Started libpod-conmon-e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321.scope.
Nov 24 13:11:28 compute-1 podman[200284]: 2025-11-24 13:11:28.7930676 +0000 UTC m=+0.072632231 container exec e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:11:28 compute-1 podman[200284]: 2025-11-24 13:11:28.82387785 +0000 UTC m=+0.103442471 container exec_died e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 13:11:28 compute-1 systemd[1]: libpod-conmon-e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321.scope: Deactivated successfully.
Nov 24 13:11:28 compute-1 sudo[200281]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:29 compute-1 sudo[200466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loxvvgfduaapzqabibrzwgylxfshtome ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989889.0685675-1327-149719482556970/AnsiballZ_file.py'
Nov 24 13:11:29 compute-1 sudo[200466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:29 compute-1 python3.9[200468]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:29 compute-1 sudo[200466]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:30 compute-1 sudo[200618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usblumgqhnotuoyjekaowwhicfolnsuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989890.077106-1336-148138588197726/AnsiballZ_podman_container_info.py'
Nov 24 13:11:30 compute-1 sudo[200618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:30 compute-1 python3.9[200620]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 24 13:11:30 compute-1 sudo[200618]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:31 compute-1 sudo[200783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kverqhvqvrwhsqbmeytnszkpzumsnkrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989890.814435-1344-200632158525265/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:31 compute-1 sudo[200783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:31 compute-1 python3.9[200785]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:31 compute-1 systemd[1]: Started libpod-conmon-a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4.scope.
Nov 24 13:11:31 compute-1 podman[200786]: 2025-11-24 13:11:31.507927239 +0000 UTC m=+0.075908632 container exec a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:11:31 compute-1 podman[200786]: 2025-11-24 13:11:31.542222338 +0000 UTC m=+0.110203711 container exec_died a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:11:31 compute-1 systemd[1]: libpod-conmon-a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4.scope: Deactivated successfully.
Nov 24 13:11:31 compute-1 sudo[200783]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:32 compute-1 sudo[200965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsddsiaocxxfxxzmlsolkfrcntsjykds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989891.7824628-1352-119797755048148/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:32 compute-1 sudo[200965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:32 compute-1 python3.9[200967]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:32 compute-1 systemd[1]: Started libpod-conmon-a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4.scope.
Nov 24 13:11:32 compute-1 podman[200968]: 2025-11-24 13:11:32.454994542 +0000 UTC m=+0.072026434 container exec a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:11:32 compute-1 podman[200968]: 2025-11-24 13:11:32.489299832 +0000 UTC m=+0.106331694 container exec_died a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:11:32 compute-1 systemd[1]: libpod-conmon-a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4.scope: Deactivated successfully.
Nov 24 13:11:32 compute-1 sudo[200965]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:33 compute-1 sudo[201150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lahfklkedvwjiauzfzulalingnfqjeyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989892.7554452-1360-18206919801594/AnsiballZ_file.py'
Nov 24 13:11:33 compute-1 sudo[201150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:33 compute-1 python3.9[201152]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:33 compute-1 sudo[201150]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:33 compute-1 sudo[201302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpupgabqvdugniapzlmagsqpykmsiczn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989893.6725676-1369-280070372239876/AnsiballZ_podman_container_info.py'
Nov 24 13:11:33 compute-1 sudo[201302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:34 compute-1 python3.9[201304]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 24 13:11:34 compute-1 sudo[201302]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:34 compute-1 sudo[201467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emzghyagmwoxeorthvgiwtqdeweirdca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989894.468152-1377-96097428493644/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:34 compute-1 sudo[201467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:35 compute-1 python3.9[201469]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:35 compute-1 systemd[1]: Started libpod-conmon-32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.scope.
Nov 24 13:11:35 compute-1 podman[201470]: 2025-11-24 13:11:35.130246048 +0000 UTC m=+0.073167496 container exec 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:11:35 compute-1 podman[201470]: 2025-11-24 13:11:35.165146404 +0000 UTC m=+0.108067842 container exec_died 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:11:35 compute-1 systemd[1]: libpod-conmon-32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.scope: Deactivated successfully.
Nov 24 13:11:35 compute-1 sudo[201467]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:35 compute-1 sudo[201652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ityksuilwlcvfllzibldoiwcgsrcqaqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989895.3977435-1385-2016535666068/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:35 compute-1 sudo[201652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:35 compute-1 python3.9[201654]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:35 compute-1 systemd[1]: Started libpod-conmon-32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.scope.
Nov 24 13:11:35 compute-1 podman[201655]: 2025-11-24 13:11:35.980996355 +0000 UTC m=+0.079416139 container exec 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:11:36 compute-1 podman[201655]: 2025-11-24 13:11:36.014317677 +0000 UTC m=+0.112737451 container exec_died 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:11:36 compute-1 systemd[1]: libpod-conmon-32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151.scope: Deactivated successfully.
Nov 24 13:11:36 compute-1 sudo[201652]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:36 compute-1 sudo[201837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhemjscwocwlxdyuanfttsicmxlnawhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989896.2618947-1393-104553971880736/AnsiballZ_file.py'
Nov 24 13:11:36 compute-1 sudo[201837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:36 compute-1 python3.9[201839]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:36 compute-1 sudo[201837]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:37 compute-1 sshd-session[199934]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:11:37 compute-1 sshd-session[199934]: banner exchange: Connection from 218.56.160.82 port 46219: Connection timed out
Nov 24 13:11:37 compute-1 sudo[202001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znrdfilzylkbjlbucezgsyocissyvdnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989897.161254-1402-43109879290505/AnsiballZ_podman_container_info.py'
Nov 24 13:11:37 compute-1 sudo[202001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:37 compute-1 podman[201969]: 2025-11-24 13:11:37.513975616 +0000 UTC m=+0.056950248 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:37 compute-1 python3.9[202008]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 24 13:11:37 compute-1 sudo[202001]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:38 compute-1 sudo[202180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aamlyboveooamcvfvbilhouzwicwvqdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989897.9306962-1410-14653859161290/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:38 compute-1 sudo[202180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:38 compute-1 python3.9[202182]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:38 compute-1 systemd[1]: Started libpod-conmon-72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.scope.
Nov 24 13:11:38 compute-1 podman[202183]: 2025-11-24 13:11:38.480451036 +0000 UTC m=+0.073271290 container exec 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:11:38 compute-1 podman[202183]: 2025-11-24 13:11:38.514177519 +0000 UTC m=+0.106997753 container exec_died 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:11:38 compute-1 systemd[1]: libpod-conmon-72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.scope: Deactivated successfully.
Nov 24 13:11:38 compute-1 sudo[202180]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:38 compute-1 sshd-session[201864]: Received disconnect from 175.100.24.139 port 36602:11: Bye Bye [preauth]
Nov 24 13:11:38 compute-1 sshd-session[201864]: Disconnected from authenticating user root 175.100.24.139 port 36602 [preauth]
Nov 24 13:11:39 compute-1 sudo[202364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsnajusnkrwlppgaddxbsgjjlpevsgmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989898.8624873-1418-1406668512531/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:39 compute-1 sudo[202364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:39 compute-1 python3.9[202366]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:39 compute-1 systemd[1]: Started libpod-conmon-72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.scope.
Nov 24 13:11:39 compute-1 podman[202367]: 2025-11-24 13:11:39.484074964 +0000 UTC m=+0.071058758 container exec 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:11:39 compute-1 podman[202386]: 2025-11-24 13:11:39.572030028 +0000 UTC m=+0.074308337 container exec_died 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:11:39 compute-1 podman[202367]: 2025-11-24 13:11:39.6074905 +0000 UTC m=+0.194474264 container exec_died 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:11:39 compute-1 systemd[1]: libpod-conmon-72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b.scope: Deactivated successfully.
Nov 24 13:11:39 compute-1 sudo[202364]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:39 compute-1 sshd-session[202245]: Invalid user desliga from 5.198.176.28 port 42862
Nov 24 13:11:39 compute-1 sshd-session[202245]: Received disconnect from 5.198.176.28 port 42862:11: Bye Bye [preauth]
Nov 24 13:11:39 compute-1 sshd-session[202245]: Disconnected from invalid user desliga 5.198.176.28 port 42862 [preauth]
Nov 24 13:11:40 compute-1 sudo[202548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llobmzqkufbqerybjexxltbmleccbppm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989899.855418-1426-124557502068140/AnsiballZ_file.py'
Nov 24 13:11:40 compute-1 sudo[202548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:40 compute-1 python3.9[202550]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:40 compute-1 sudo[202548]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:40 compute-1 sudo[202712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lguphjwnhslgdzpkhwkonbmdscruooyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989900.6507738-1435-248623909779274/AnsiballZ_podman_container_info.py'
Nov 24 13:11:40 compute-1 sudo[202712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:40 compute-1 podman[202674]: 2025-11-24 13:11:40.953522515 +0000 UTC m=+0.053128032 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 13:11:41 compute-1 python3.9[202720]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 24 13:11:41 compute-1 sudo[202712]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:41 compute-1 sudo[202883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgcqmatidbsekfagqsymohzaazkyfhfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989901.3920395-1443-63436276443581/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:41 compute-1 sudo[202883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:41 compute-1 python3.9[202885]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:42 compute-1 systemd[1]: Started libpod-conmon-952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.scope.
Nov 24 13:11:42 compute-1 podman[202886]: 2025-11-24 13:11:42.068730232 +0000 UTC m=+0.149140819 container exec 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Nov 24 13:11:42 compute-1 podman[202886]: 2025-11-24 13:11:42.107189566 +0000 UTC m=+0.187600133 container exec_died 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git)
Nov 24 13:11:42 compute-1 systemd[1]: libpod-conmon-952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.scope: Deactivated successfully.
Nov 24 13:11:42 compute-1 sudo[202883]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:42 compute-1 sudo[203069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjdlcvfskcntmsrmcyerafhfewjtfbwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989902.4153254-1451-199418753900558/AnsiballZ_podman_container_exec.py'
Nov 24 13:11:42 compute-1 sudo[203069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:42 compute-1 python3.9[203071]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 13:11:43 compute-1 systemd[1]: Started libpod-conmon-952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.scope.
Nov 24 13:11:43 compute-1 podman[203072]: 2025-11-24 13:11:43.256092086 +0000 UTC m=+0.314074844 container exec 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 13:11:43 compute-1 podman[203091]: 2025-11-24 13:11:43.333067017 +0000 UTC m=+0.064331552 container exec_died 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 13:11:43 compute-1 podman[203072]: 2025-11-24 13:11:43.357867024 +0000 UTC m=+0.415849772 container exec_died 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, release=1755695350, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Nov 24 13:11:43 compute-1 systemd[1]: libpod-conmon-952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b.scope: Deactivated successfully.
Nov 24 13:11:43 compute-1 sudo[203069]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:44 compute-1 sudo[203254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbkwmsdsazfgacljtjrdqbppgukxvyll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989903.7685454-1459-197300904334135/AnsiballZ_file.py'
Nov 24 13:11:44 compute-1 sudo[203254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:11:44 compute-1 python3.9[203256]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:11:44 compute-1 sudo[203254]: pam_unix(sudo:session): session closed for user root
Nov 24 13:11:45 compute-1 podman[203281]: 2025-11-24 13:11:45.558942444 +0000 UTC m=+0.103822934 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 13:11:45 compute-1 podman[203282]: 2025-11-24 13:11:45.587100133 +0000 UTC m=+0.119725214 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 24 13:11:52 compute-1 podman[203327]: 2025-11-24 13:11:52.51514257 +0000 UTC m=+0.068470897 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:11:54 compute-1 sshd-session[203152]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:11:54 compute-1 sshd-session[203152]: banner exchange: Connection from 218.56.160.82 port 12053: Connection timed out
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.003 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.023 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.023 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.023 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.024 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.024 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.042 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.043 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.043 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.043 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.188 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.189 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6075MB free_disk=73.49846649169922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.189 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.190 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.263 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.264 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.287 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.300 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.301 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.301 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.943 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.944 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.944 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.968 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.968 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:58 compute-1 nova_compute[187078]: 2025-11-24 13:11:58.968 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:59 compute-1 sshd-session[203350]: Received disconnect from 176.114.89.34 port 58910:11: Bye Bye [preauth]
Nov 24 13:11:59 compute-1 sshd-session[203350]: Disconnected from authenticating user root 176.114.89.34 port 58910 [preauth]
Nov 24 13:11:59 compute-1 nova_compute[187078]: 2025-11-24 13:11:59.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:11:59 compute-1 nova_compute[187078]: 2025-11-24 13:11:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:02 compute-1 sshd-session[203353]: Received disconnect from 85.209.134.43 port 42936:11: Bye Bye [preauth]
Nov 24 13:12:02 compute-1 sshd-session[203353]: Disconnected from authenticating user root 85.209.134.43 port 42936 [preauth]
Nov 24 13:12:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:12:04.139 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:12:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:12:04.140 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:12:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:12:04.140 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:12:04 compute-1 sudo[203480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amvsasvoqlxkgdaimlrpheyztdhymxyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989924.614039-1635-259430677930316/AnsiballZ_file.py'
Nov 24 13:12:04 compute-1 sudo[203480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:05 compute-1 python3.9[203482]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:05 compute-1 sudo[203480]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:05 compute-1 sudo[203632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaixxdhttsgnrzotvalswotgyxsdopgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989925.353216-1651-20207430274700/AnsiballZ_stat.py'
Nov 24 13:12:05 compute-1 sudo[203632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:05 compute-1 python3.9[203634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:05 compute-1 sudo[203632]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:06 compute-1 sudo[203755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtkxcjybjafxdvnthgnxqqwkinhisgvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989925.353216-1651-20207430274700/AnsiballZ_copy.py'
Nov 24 13:12:06 compute-1 sudo[203755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:06 compute-1 python3.9[203757]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763989925.353216-1651-20207430274700/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:06 compute-1 sudo[203755]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:07 compute-1 sudo[203907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlksswesruzwdewimwdcafxkltrvrkfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989926.8040414-1683-23817979999875/AnsiballZ_file.py'
Nov 24 13:12:07 compute-1 sudo[203907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:07 compute-1 python3.9[203909]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:07 compute-1 sudo[203907]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:07 compute-1 sudo[204076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbazypqkrczfrbzetimerkucakanvlnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989927.6018014-1700-96437932511886/AnsiballZ_stat.py'
Nov 24 13:12:07 compute-1 sudo[204076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:07 compute-1 podman[204033]: 2025-11-24 13:12:07.920898802 +0000 UTC m=+0.063555421 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:12:08 compute-1 python3.9[204085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:08 compute-1 sudo[204076]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:08 compute-1 sudo[204161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjayeejbyhyuxafqttfssvbrusesoegr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989927.6018014-1700-96437932511886/AnsiballZ_file.py'
Nov 24 13:12:08 compute-1 sudo[204161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:08 compute-1 python3.9[204163]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:08 compute-1 sudo[204161]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:09 compute-1 sudo[204313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxftdlshmgwkidcenopuxeeumraldnms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989928.8633764-1723-236247751399475/AnsiballZ_stat.py'
Nov 24 13:12:09 compute-1 sudo[204313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:09 compute-1 python3.9[204315]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:09 compute-1 sudo[204313]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:09 compute-1 sudo[204391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydqqgpfpsniemlxdgbxkifdvyqftqhav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989928.8633764-1723-236247751399475/AnsiballZ_file.py'
Nov 24 13:12:09 compute-1 sudo[204391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:09 compute-1 python3.9[204393]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mdgu9zfr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:09 compute-1 sudo[204391]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:09 compute-1 sshd-session[203352]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:12:09 compute-1 sshd-session[203352]: banner exchange: Connection from 218.56.160.82 port 48314: Connection timed out
Nov 24 13:12:10 compute-1 sudo[204543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnqbwakcvstnggyiskbhkgrwlzyqiyjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989930.1069736-1747-47946273600976/AnsiballZ_stat.py'
Nov 24 13:12:10 compute-1 sudo[204543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:10 compute-1 python3.9[204545]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:10 compute-1 sudo[204543]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:10 compute-1 sudo[204621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwppuqyzuorsdnhiyqrukwvaoyxregia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989930.1069736-1747-47946273600976/AnsiballZ_file.py'
Nov 24 13:12:10 compute-1 sudo[204621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:11 compute-1 python3.9[204623]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:11 compute-1 sudo[204621]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:11 compute-1 podman[204648]: 2025-11-24 13:12:11.558137773 +0000 UTC m=+0.094214488 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:12:11 compute-1 sudo[204790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wakcvobnwfeisaqdvfkaodftmqbhdldd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989931.5654056-1773-49064617320997/AnsiballZ_command.py'
Nov 24 13:12:11 compute-1 sudo[204790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:12 compute-1 python3.9[204792]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:12:12 compute-1 sudo[204790]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:12 compute-1 sudo[204943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvkmtjhbapoiqmaubzgzzoqimmitedwq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763989932.3484437-1789-142558582709796/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 13:12:12 compute-1 sudo[204943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:12 compute-1 python3[204945]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 13:12:13 compute-1 sudo[204943]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:13 compute-1 sudo[205095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmuilokdrpdchzrnxbdbvcmwypslddfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989933.3067193-1805-32195164855158/AnsiballZ_stat.py'
Nov 24 13:12:13 compute-1 sudo[205095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:13 compute-1 python3.9[205097]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:13 compute-1 sudo[205095]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:14 compute-1 sudo[205173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axblldowbvfvwsulrlyccbcvxqewqirt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989933.3067193-1805-32195164855158/AnsiballZ_file.py'
Nov 24 13:12:14 compute-1 sudo[205173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:14 compute-1 python3.9[205175]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:14 compute-1 sudo[205173]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:14 compute-1 sudo[205325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boyxqduyjuhnkhqzmdnohsfwvpfmmtjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989934.6207426-1829-47037138656372/AnsiballZ_stat.py'
Nov 24 13:12:14 compute-1 sudo[205325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:15 compute-1 python3.9[205327]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:15 compute-1 sudo[205325]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:15 compute-1 sudo[205403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdqjqvlqekwnqfemdmskdjxcplaneowp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989934.6207426-1829-47037138656372/AnsiballZ_file.py'
Nov 24 13:12:15 compute-1 sudo[205403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:15 compute-1 python3.9[205405]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:15 compute-1 sudo[205403]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:16 compute-1 sudo[205584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmuliimmzjjuatcmlixzwyteephkkydf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989936.0198848-1854-165669800834459/AnsiballZ_stat.py'
Nov 24 13:12:16 compute-1 sudo[205584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:16 compute-1 podman[205529]: 2025-11-24 13:12:16.379637343 +0000 UTC m=+0.064863976 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:12:16 compute-1 podman[205530]: 2025-11-24 13:12:16.425904784 +0000 UTC m=+0.101993344 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 24 13:12:16 compute-1 python3.9[205594]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:16 compute-1 sudo[205584]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:16 compute-1 sudo[205677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmzwnjmlcwwjrnawpvhfsbuprsvrazjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989936.0198848-1854-165669800834459/AnsiballZ_file.py'
Nov 24 13:12:16 compute-1 sudo[205677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:17 compute-1 python3.9[205679]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:17 compute-1 sudo[205677]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:17 compute-1 sudo[205830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppvkhxswneghgknzufluxaqlxtmyfkye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989937.4069135-1877-213984818797224/AnsiballZ_stat.py'
Nov 24 13:12:17 compute-1 sudo[205830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:18 compute-1 python3.9[205832]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:18 compute-1 sudo[205830]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:18 compute-1 sudo[205908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzoykmssqibwcckyqganzeacfxvrpvha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989937.4069135-1877-213984818797224/AnsiballZ_file.py'
Nov 24 13:12:18 compute-1 sudo[205908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:18 compute-1 python3.9[205910]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:18 compute-1 sudo[205908]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:19 compute-1 sudo[206060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlbrhglwwwxtteftwkvwiizvsgxnuvvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989938.9106483-1901-105241080616779/AnsiballZ_stat.py'
Nov 24 13:12:19 compute-1 sudo[206060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:19 compute-1 python3.9[206062]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 13:12:19 compute-1 sudo[206060]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:19 compute-1 sudo[206185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moizevurjysotzuzjycbrawndvjdryis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989938.9106483-1901-105241080616779/AnsiballZ_copy.py'
Nov 24 13:12:19 compute-1 sudo[206185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:20 compute-1 python3.9[206187]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763989938.9106483-1901-105241080616779/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:20 compute-1 sudo[206185]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:20 compute-1 sudo[206339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvlvkrlclyxuewsaxneqwgpsjpasqprb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989940.364651-1931-174125105478943/AnsiballZ_file.py'
Nov 24 13:12:20 compute-1 sudo[206339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:20 compute-1 python3.9[206341]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:20 compute-1 sudo[206339]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:21 compute-1 sudo[206491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqaiyvwilwkqefcbanujyfywsuofrvei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989941.1882005-1948-54058706292090/AnsiballZ_command.py'
Nov 24 13:12:21 compute-1 sudo[206491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:21 compute-1 python3.9[206493]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:12:21 compute-1 sudo[206491]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:21 compute-1 sshd-session[206287]: Invalid user ftpuser1 from 68.183.82.237 port 51452
Nov 24 13:12:22 compute-1 sshd-session[206287]: Received disconnect from 68.183.82.237 port 51452:11: Bye Bye [preauth]
Nov 24 13:12:22 compute-1 sshd-session[206287]: Disconnected from invalid user ftpuser1 68.183.82.237 port 51452 [preauth]
Nov 24 13:12:22 compute-1 sudo[206646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjbdiuadtuwwudgavpanaccaelcpowbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989942.0753877-1963-121706457924481/AnsiballZ_blockinfile.py'
Nov 24 13:12:22 compute-1 sudo[206646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:22 compute-1 podman[206648]: 2025-11-24 13:12:22.678142785 +0000 UTC m=+0.082011872 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Nov 24 13:12:22 compute-1 python3.9[206649]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:22 compute-1 sudo[206646]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:23 compute-1 sudo[206819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqaijcihosbvuegedaapoepkmzdujqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989943.160832-1981-153278555180341/AnsiballZ_command.py'
Nov 24 13:12:23 compute-1 sudo[206819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:23 compute-1 python3.9[206821]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:12:23 compute-1 sudo[206819]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:24 compute-1 sudo[206972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eutmgryttweredtiuvzzcuvroqubdllc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989943.9429739-1997-93676211624166/AnsiballZ_stat.py'
Nov 24 13:12:24 compute-1 sudo[206972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:24 compute-1 python3.9[206974]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 13:12:24 compute-1 sudo[206972]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:25 compute-1 sudo[207126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onzwmoiwdjzgoquhenlmjmmugtuiwmqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989944.7089677-2013-185574473615836/AnsiballZ_command.py'
Nov 24 13:12:25 compute-1 sudo[207126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:25 compute-1 python3.9[207128]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:12:25 compute-1 sudo[207126]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:25 compute-1 sudo[207281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nikeiddjqfhkcaddsjtncmlywpotndzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763989945.5890563-2029-238784341444823/AnsiballZ_file.py'
Nov 24 13:12:25 compute-1 sudo[207281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:12:26 compute-1 python3.9[207283]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:12:26 compute-1 sudo[207281]: pam_unix(sudo:session): session closed for user root
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: ERROR   13:12:26 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: ERROR   13:12:26 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: ERROR   13:12:26 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: ERROR   13:12:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: ERROR   13:12:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:12:26 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:12:26 compute-1 sshd-session[187387]: Connection closed by 192.168.122.30 port 50190
Nov 24 13:12:26 compute-1 sshd-session[187384]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:12:26 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Nov 24 13:12:26 compute-1 systemd[1]: session-27.scope: Consumed 1min 23.998s CPU time.
Nov 24 13:12:26 compute-1 systemd-logind[815]: Session 27 logged out. Waiting for processes to exit.
Nov 24 13:12:26 compute-1 systemd-logind[815]: Removed session 27.
Nov 24 13:12:27 compute-1 sshd-session[205703]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:12:27 compute-1 sshd-session[205703]: banner exchange: Connection from 218.56.160.82 port 10720: Connection timed out
Nov 24 13:12:35 compute-1 podman[197429]: time="2025-11-24T13:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:12:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:12:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2561 "" "Go-http-client/1.1"
Nov 24 13:12:38 compute-1 podman[207316]: 2025-11-24 13:12:38.53649331 +0000 UTC m=+0.068062370 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:12:42 compute-1 podman[207342]: 2025-11-24 13:12:42.505196581 +0000 UTC m=+0.056445805 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:12:43 compute-1 sshd-session[207314]: error: kex_exchange_identification: read: Connection timed out
Nov 24 13:12:43 compute-1 sshd-session[207314]: banner exchange: Connection from 218.56.160.82 port 11805: Connection timed out
Nov 24 13:12:46 compute-1 podman[207361]: 2025-11-24 13:12:46.501585435 +0000 UTC m=+0.050257823 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 13:12:46 compute-1 podman[207362]: 2025-11-24 13:12:46.543586936 +0000 UTC m=+0.084039085 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: ERROR   13:12:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: ERROR   13:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: ERROR   13:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: ERROR   13:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: ERROR   13:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:12:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:12:51 compute-1 sshd-session[207407]: Invalid user hadoop from 5.198.176.28 port 42968
Nov 24 13:12:51 compute-1 sshd-session[207407]: Received disconnect from 5.198.176.28 port 42968:11: Bye Bye [preauth]
Nov 24 13:12:51 compute-1 sshd-session[207407]: Disconnected from invalid user hadoop 5.198.176.28 port 42968 [preauth]
Nov 24 13:12:53 compute-1 podman[207411]: 2025-11-24 13:12:53.52776176 +0000 UTC m=+0.077923975 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:12:53 compute-1 sshd-session[207409]: Invalid user web from 218.56.160.82 port 12985
Nov 24 13:12:53 compute-1 sshd-session[207409]: Received disconnect from 218.56.160.82 port 12985:11: Bye Bye [preauth]
Nov 24 13:12:53 compute-1 sshd-session[207409]: Disconnected from invalid user web 218.56.160.82 port 12985 [preauth]
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.693 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.871 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.872 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6104MB free_disk=73.49799346923828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.872 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.872 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.948 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.949 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:12:57 compute-1 nova_compute[187078]: 2025-11-24 13:12:57.995 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:12:58 compute-1 nova_compute[187078]: 2025-11-24 13:12:58.009 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:12:58 compute-1 nova_compute[187078]: 2025-11-24 13:12:58.011 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:12:58 compute-1 nova_compute[187078]: 2025-11-24 13:12:58.011 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.008 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.009 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.009 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.030 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.031 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.031 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.031 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:12:59 compute-1 nova_compute[187078]: 2025-11-24 13:12:59.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:00 compute-1 sshd-session[207434]: Invalid user sol from 45.148.10.240 port 47826
Nov 24 13:13:00 compute-1 nova_compute[187078]: 2025-11-24 13:13:00.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:00 compute-1 sshd-session[207434]: Connection closed by invalid user sol 45.148.10.240 port 47826 [preauth]
Nov 24 13:13:02 compute-1 sshd-session[207438]: Invalid user minecraft from 85.209.134.43 port 49824
Nov 24 13:13:02 compute-1 sshd-session[207438]: Received disconnect from 85.209.134.43 port 49824:11: Bye Bye [preauth]
Nov 24 13:13:02 compute-1 sshd-session[207438]: Disconnected from invalid user minecraft 85.209.134.43 port 49824 [preauth]
Nov 24 13:13:02 compute-1 sshd-session[207436]: Received disconnect from 175.100.24.139 port 38878:11: Bye Bye [preauth]
Nov 24 13:13:02 compute-1 sshd-session[207436]: Disconnected from authenticating user root 175.100.24.139 port 38878 [preauth]
Nov 24 13:13:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:13:04.140 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:13:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:13:04.141 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:13:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:13:04.141 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:13:05 compute-1 podman[197429]: time="2025-11-24T13:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:13:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:13:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Nov 24 13:13:07 compute-1 sshd-session[207440]: Invalid user elemental from 176.114.89.34 port 35854
Nov 24 13:13:07 compute-1 sshd-session[207440]: Received disconnect from 176.114.89.34 port 35854:11: Bye Bye [preauth]
Nov 24 13:13:07 compute-1 sshd-session[207440]: Disconnected from invalid user elemental 176.114.89.34 port 35854 [preauth]
Nov 24 13:13:09 compute-1 podman[207442]: 2025-11-24 13:13:09.512350958 +0000 UTC m=+0.057381072 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:13:13 compute-1 podman[207468]: 2025-11-24 13:13:13.534663124 +0000 UTC m=+0.078685925 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:13:17 compute-1 podman[207487]: 2025-11-24 13:13:17.515393614 +0000 UTC m=+0.057381982 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:13:17 compute-1 podman[207488]: 2025-11-24 13:13:17.539725232 +0000 UTC m=+0.080776674 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:13:24 compute-1 podman[207534]: 2025-11-24 13:13:24.527086498 +0000 UTC m=+0.071425264 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Nov 24 13:13:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:13:26.960 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:13:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:13:26.961 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:13:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:13:26.962 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:13:35 compute-1 podman[197429]: time="2025-11-24T13:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:13:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:13:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2567 "" "Go-http-client/1.1"
Nov 24 13:13:40 compute-1 sshd-session[207560]: Received disconnect from 68.183.82.237 port 49038:11: Bye Bye [preauth]
Nov 24 13:13:40 compute-1 sshd-session[207560]: Disconnected from authenticating user root 68.183.82.237 port 49038 [preauth]
Nov 24 13:13:40 compute-1 podman[207562]: 2025-11-24 13:13:40.515694718 +0000 UTC m=+0.062720759 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:13:44 compute-1 podman[207586]: 2025-11-24 13:13:44.5308818 +0000 UTC m=+0.072088801 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:13:48 compute-1 podman[207605]: 2025-11-24 13:13:48.506948865 +0000 UTC m=+0.059259183 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 13:13:48 compute-1 podman[207606]: 2025-11-24 13:13:48.541868744 +0000 UTC m=+0.086510907 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: ERROR   13:13:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: ERROR   13:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: ERROR   13:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: ERROR   13:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: ERROR   13:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:13:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:13:55 compute-1 podman[207652]: 2025-11-24 13:13:55.534218773 +0000 UTC m=+0.070674163 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.692 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.836 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.837 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6153MB free_disk=73.49802780151367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.837 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.837 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.894 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.895 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.933 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.943 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.945 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:13:57 compute-1 nova_compute[187078]: 2025-11-24 13:13:57.946 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:13:58 compute-1 nova_compute[187078]: 2025-11-24 13:13:58.938 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:58 compute-1 nova_compute[187078]: 2025-11-24 13:13:58.956 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:58 compute-1 nova_compute[187078]: 2025-11-24 13:13:58.956 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:13:59 compute-1 sshd-session[207671]: Invalid user sonar from 45.78.217.131 port 45552
Nov 24 13:13:59 compute-1 sshd-session[207671]: Received disconnect from 45.78.217.131 port 45552:11: Bye Bye [preauth]
Nov 24 13:13:59 compute-1 sshd-session[207671]: Disconnected from invalid user sonar 45.78.217.131 port 45552 [preauth]
Nov 24 13:13:59 compute-1 nova_compute[187078]: 2025-11-24 13:13:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:59 compute-1 nova_compute[187078]: 2025-11-24 13:13:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:13:59 compute-1 nova_compute[187078]: 2025-11-24 13:13:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:00 compute-1 nova_compute[187078]: 2025-11-24 13:14:00.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:00 compute-1 nova_compute[187078]: 2025-11-24 13:14:00.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:00 compute-1 nova_compute[187078]: 2025-11-24 13:14:00.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:14:00 compute-1 nova_compute[187078]: 2025-11-24 13:14:00.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:14:00 compute-1 nova_compute[187078]: 2025-11-24 13:14:00.677 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:14:00 compute-1 nova_compute[187078]: 2025-11-24 13:14:00.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:00 compute-1 sshd-session[207673]: Received disconnect from 85.209.134.43 port 35816:11: Bye Bye [preauth]
Nov 24 13:14:00 compute-1 sshd-session[207673]: Disconnected from authenticating user root 85.209.134.43 port 35816 [preauth]
Nov 24 13:14:01 compute-1 nova_compute[187078]: 2025-11-24 13:14:01.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:02 compute-1 sshd-session[207675]: Received disconnect from 5.198.176.28 port 43078:11: Bye Bye [preauth]
Nov 24 13:14:02 compute-1 sshd-session[207675]: Disconnected from authenticating user root 5.198.176.28 port 43078 [preauth]
Nov 24 13:14:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:14:04.141 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:14:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:14:04.141 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:14:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:14:04.142 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:14:05 compute-1 podman[197429]: time="2025-11-24T13:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:14:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:14:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Nov 24 13:14:09 compute-1 sshd-session[207678]: Invalid user sol from 193.32.162.145 port 57290
Nov 24 13:14:09 compute-1 sshd-session[207678]: Connection closed by invalid user sol 193.32.162.145 port 57290 [preauth]
Nov 24 13:14:11 compute-1 podman[207680]: 2025-11-24 13:14:11.82212261 +0000 UTC m=+0.058737078 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:14:15 compute-1 podman[207706]: 2025-11-24 13:14:15.513204944 +0000 UTC m=+0.059412806 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 13:14:15 compute-1 sshd-session[207704]: Invalid user openbravo from 176.114.89.34 port 48582
Nov 24 13:14:15 compute-1 sshd-session[207704]: Received disconnect from 176.114.89.34 port 48582:11: Bye Bye [preauth]
Nov 24 13:14:15 compute-1 sshd-session[207704]: Disconnected from invalid user openbravo 176.114.89.34 port 48582 [preauth]
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: ERROR   13:14:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: ERROR   13:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: ERROR   13:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: ERROR   13:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: ERROR   13:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:14:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:14:19 compute-1 podman[207725]: 2025-11-24 13:14:19.501878953 +0000 UTC m=+0.054288134 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 13:14:19 compute-1 podman[207726]: 2025-11-24 13:14:19.542777649 +0000 UTC m=+0.087522474 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller)
Nov 24 13:14:26 compute-1 podman[207773]: 2025-11-24 13:14:26.517421542 +0000 UTC m=+0.065949609 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 24 13:14:26 compute-1 sshd-session[207771]: Invalid user ts3 from 175.100.24.139 port 40996
Nov 24 13:14:27 compute-1 sshd-session[207771]: Received disconnect from 175.100.24.139 port 40996:11: Bye Bye [preauth]
Nov 24 13:14:27 compute-1 sshd-session[207771]: Disconnected from invalid user ts3 175.100.24.139 port 40996 [preauth]
Nov 24 13:14:35 compute-1 podman[197429]: time="2025-11-24T13:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:14:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:14:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 24 13:14:42 compute-1 podman[207795]: 2025-11-24 13:14:42.505581355 +0000 UTC m=+0.050294010 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:14:46 compute-1 podman[207820]: 2025-11-24 13:14:46.561041525 +0000 UTC m=+0.061516208 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: ERROR   13:14:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: ERROR   13:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: ERROR   13:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: ERROR   13:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: ERROR   13:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:14:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:14:49 compute-1 sshd-session[207841]: Invalid user steam from 80.94.95.115 port 15126
Nov 24 13:14:49 compute-1 podman[207843]: 2025-11-24 13:14:49.681994505 +0000 UTC m=+0.063082780 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:14:49 compute-1 podman[207844]: 2025-11-24 13:14:49.713560399 +0000 UTC m=+0.090878581 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:14:50 compute-1 sshd-session[207841]: Connection closed by invalid user steam 80.94.95.115 port 15126 [preauth]
Nov 24 13:14:56 compute-1 sshd-session[207890]: Invalid user dockeruser from 68.183.82.237 port 59296
Nov 24 13:14:56 compute-1 sshd-session[207890]: Received disconnect from 68.183.82.237 port 59296:11: Bye Bye [preauth]
Nov 24 13:14:56 compute-1 sshd-session[207890]: Disconnected from invalid user dockeruser 68.183.82.237 port 59296 [preauth]
Nov 24 13:14:57 compute-1 podman[207892]: 2025-11-24 13:14:57.541267572 +0000 UTC m=+0.079286675 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64)
Nov 24 13:14:57 compute-1 nova_compute[187078]: 2025-11-24 13:14:57.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:57 compute-1 nova_compute[187078]: 2025-11-24 13:14:57.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:14:57 compute-1 nova_compute[187078]: 2025-11-24 13:14:57.683 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:14:57 compute-1 nova_compute[187078]: 2025-11-24 13:14:57.684 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:57 compute-1 nova_compute[187078]: 2025-11-24 13:14:57.684 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:14:57 compute-1 nova_compute[187078]: 2025-11-24 13:14:57.693 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:58 compute-1 nova_compute[187078]: 2025-11-24 13:14:58.703 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:58 compute-1 nova_compute[187078]: 2025-11-24 13:14:58.704 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:14:59 compute-1 sshd-session[207913]: Invalid user funded from 45.148.10.240 port 44616
Nov 24 13:14:59 compute-1 sshd-session[207913]: Connection closed by invalid user funded 45.148.10.240 port 44616 [preauth]
Nov 24 13:14:59 compute-1 sshd-session[207915]: Invalid user postgres from 85.209.134.43 port 57820
Nov 24 13:14:59 compute-1 sshd-session[207915]: Received disconnect from 85.209.134.43 port 57820:11: Bye Bye [preauth]
Nov 24 13:14:59 compute-1 sshd-session[207915]: Disconnected from invalid user postgres 85.209.134.43 port 57820 [preauth]
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.697 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.698 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.698 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.699 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.972 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.974 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6169MB free_disk=73.49727630615234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.974 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:14:59 compute-1 nova_compute[187078]: 2025-11-24 13:14:59.974 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.068 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.069 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.202 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.279 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.279 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.302 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.331 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.356 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.367 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.368 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:15:00 compute-1 nova_compute[187078]: 2025-11-24 13:15:00.369 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:15:01 compute-1 nova_compute[187078]: 2025-11-24 13:15:01.369 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:15:01 compute-1 nova_compute[187078]: 2025-11-24 13:15:01.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:15:01 compute-1 nova_compute[187078]: 2025-11-24 13:15:01.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:15:01 compute-1 nova_compute[187078]: 2025-11-24 13:15:01.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:15:01 compute-1 nova_compute[187078]: 2025-11-24 13:15:01.677 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:15:01 compute-1 nova_compute[187078]: 2025-11-24 13:15:01.677 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:15:02 compute-1 nova_compute[187078]: 2025-11-24 13:15:02.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:15:02 compute-1 nova_compute[187078]: 2025-11-24 13:15:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:15:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:15:04.142 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:15:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:15:04.143 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:15:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:15:04.143 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:15:05 compute-1 podman[197429]: time="2025-11-24T13:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:15:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:15:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 24 13:15:09 compute-1 sshd-session[207919]: Invalid user gg from 5.198.176.28 port 43180
Nov 24 13:15:09 compute-1 sshd-session[207919]: Received disconnect from 5.198.176.28 port 43180:11: Bye Bye [preauth]
Nov 24 13:15:09 compute-1 sshd-session[207919]: Disconnected from invalid user gg 5.198.176.28 port 43180 [preauth]
Nov 24 13:15:11 compute-1 sshd[128576]: Timeout before authentication for connection from 218.56.160.82 to 38.102.83.173, pid = 207466
Nov 24 13:15:13 compute-1 podman[207921]: 2025-11-24 13:15:13.570897963 +0000 UTC m=+0.109477743 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:15:17 compute-1 podman[207945]: 2025-11-24 13:15:17.50047419 +0000 UTC m=+0.052624973 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: ERROR   13:15:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: ERROR   13:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: ERROR   13:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: ERROR   13:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: ERROR   13:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:15:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:15:20 compute-1 podman[207967]: 2025-11-24 13:15:20.528620694 +0000 UTC m=+0.070044801 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Nov 24 13:15:20 compute-1 podman[207968]: 2025-11-24 13:15:20.564742674 +0000 UTC m=+0.099113187 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:15:21 compute-1 sshd-session[208011]: Invalid user user from 176.114.89.34 port 37578
Nov 24 13:15:21 compute-1 sshd-session[208011]: Received disconnect from 176.114.89.34 port 37578:11: Bye Bye [preauth]
Nov 24 13:15:21 compute-1 sshd-session[208011]: Disconnected from invalid user user 176.114.89.34 port 37578 [preauth]
Nov 24 13:15:28 compute-1 podman[208013]: 2025-11-24 13:15:28.514984724 +0000 UTC m=+0.056360516 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, release=1755695350)
Nov 24 13:15:35 compute-1 podman[197429]: time="2025-11-24T13:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:15:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:15:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Nov 24 13:15:44 compute-1 podman[208036]: 2025-11-24 13:15:44.515752873 +0000 UTC m=+0.058290828 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:15:48 compute-1 podman[208060]: 2025-11-24 13:15:48.503559275 +0000 UTC m=+0.051301597 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: ERROR   13:15:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: ERROR   13:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: ERROR   13:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: ERROR   13:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: ERROR   13:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:15:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:15:51 compute-1 podman[208079]: 2025-11-24 13:15:51.507609697 +0000 UTC m=+0.054684425 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 13:15:51 compute-1 podman[208080]: 2025-11-24 13:15:51.534777658 +0000 UTC m=+0.069283192 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:15:54 compute-1 sshd-session[208125]: Invalid user bot from 175.100.24.139 port 43280
Nov 24 13:15:54 compute-1 sshd-session[208125]: Received disconnect from 175.100.24.139 port 43280:11: Bye Bye [preauth]
Nov 24 13:15:54 compute-1 sshd-session[208125]: Disconnected from invalid user bot 175.100.24.139 port 43280 [preauth]
Nov 24 13:15:55 compute-1 sshd-session[208127]: Invalid user astra from 85.209.134.43 port 44468
Nov 24 13:15:55 compute-1 sshd-session[208127]: Received disconnect from 85.209.134.43 port 44468:11: Bye Bye [preauth]
Nov 24 13:15:55 compute-1 sshd-session[208127]: Disconnected from invalid user astra 85.209.134.43 port 44468 [preauth]
Nov 24 13:15:59 compute-1 podman[208129]: 2025-11-24 13:15:59.531084497 +0000 UTC m=+0.070571829 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Nov 24 13:16:00 compute-1 nova_compute[187078]: 2025-11-24 13:16:00.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:00 compute-1 nova_compute[187078]: 2025-11-24 13:16:00.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:00 compute-1 nova_compute[187078]: 2025-11-24 13:16:00.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.692 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.840 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.841 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6183MB free_disk=73.49727630615234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.841 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.841 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.917 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.917 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.938 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.955 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.956 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:16:01 compute-1 nova_compute[187078]: 2025-11-24 13:16:01.956 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:16:02 compute-1 nova_compute[187078]: 2025-11-24 13:16:02.949 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:02 compute-1 nova_compute[187078]: 2025-11-24 13:16:02.970 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:02 compute-1 nova_compute[187078]: 2025-11-24 13:16:02.970 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:03 compute-1 nova_compute[187078]: 2025-11-24 13:16:03.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:03 compute-1 nova_compute[187078]: 2025-11-24 13:16:03.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:16:03 compute-1 nova_compute[187078]: 2025-11-24 13:16:03.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:16:03 compute-1 nova_compute[187078]: 2025-11-24 13:16:03.685 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:16:03 compute-1 nova_compute[187078]: 2025-11-24 13:16:03.685 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:16:04.143 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:16:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:16:04.144 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:16:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:16:04.144 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:16:04 compute-1 nova_compute[187078]: 2025-11-24 13:16:04.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:16:05 compute-1 podman[197429]: time="2025-11-24T13:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:16:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:16:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 24 13:16:10 compute-1 sshd-session[208151]: Invalid user packer from 68.183.82.237 port 53622
Nov 24 13:16:10 compute-1 sshd-session[208151]: Received disconnect from 68.183.82.237 port 53622:11: Bye Bye [preauth]
Nov 24 13:16:10 compute-1 sshd-session[208151]: Disconnected from invalid user packer 68.183.82.237 port 53622 [preauth]
Nov 24 13:16:14 compute-1 sshd-session[208153]: Invalid user administrator from 5.198.176.28 port 43284
Nov 24 13:16:14 compute-1 sshd-session[208153]: Received disconnect from 5.198.176.28 port 43284:11: Bye Bye [preauth]
Nov 24 13:16:14 compute-1 sshd-session[208153]: Disconnected from invalid user administrator 5.198.176.28 port 43284 [preauth]
Nov 24 13:16:15 compute-1 podman[208155]: 2025-11-24 13:16:15.541003945 +0000 UTC m=+0.081982159 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: ERROR   13:16:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: ERROR   13:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: ERROR   13:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: ERROR   13:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: ERROR   13:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:16:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:16:19 compute-1 podman[208180]: 2025-11-24 13:16:19.522791205 +0000 UTC m=+0.065981302 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:16:22 compute-1 podman[208201]: 2025-11-24 13:16:22.506896194 +0000 UTC m=+0.058013134 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 13:16:22 compute-1 podman[208202]: 2025-11-24 13:16:22.58547627 +0000 UTC m=+0.122042493 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:16:28 compute-1 sshd-session[208249]: Received disconnect from 176.114.89.34 port 42378:11: Bye Bye [preauth]
Nov 24 13:16:28 compute-1 sshd-session[208249]: Disconnected from authenticating user root 176.114.89.34 port 42378 [preauth]
Nov 24 13:16:30 compute-1 podman[208253]: 2025-11-24 13:16:30.528749121 +0000 UTC m=+0.072842201 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 24 13:16:34 compute-1 sshd-session[208251]: Invalid user ionadmin from 45.78.194.40 port 48574
Nov 24 13:16:35 compute-1 podman[197429]: time="2025-11-24T13:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:16:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:16:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Nov 24 13:16:39 compute-1 sshd-session[208251]: Received disconnect from 45.78.194.40 port 48574:11: Bye Bye [preauth]
Nov 24 13:16:39 compute-1 sshd-session[208251]: Disconnected from invalid user ionadmin 45.78.194.40 port 48574 [preauth]
Nov 24 13:16:40 compute-1 sshd-session[208274]: Received disconnect from 193.46.255.7 port 37622:11:  [preauth]
Nov 24 13:16:40 compute-1 sshd-session[208274]: Disconnected from authenticating user root 193.46.255.7 port 37622 [preauth]
Nov 24 13:16:42 compute-1 sshd-session[208199]: Received disconnect from 45.78.217.131 port 42048:11: Bye Bye [preauth]
Nov 24 13:16:42 compute-1 sshd-session[208199]: Disconnected from authenticating user root 45.78.217.131 port 42048 [preauth]
Nov 24 13:16:46 compute-1 podman[208276]: 2025-11-24 13:16:46.526384024 +0000 UTC m=+0.069196841 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: ERROR   13:16:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: ERROR   13:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: ERROR   13:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: ERROR   13:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: ERROR   13:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:16:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:16:50 compute-1 podman[208301]: 2025-11-24 13:16:50.496535778 +0000 UTC m=+0.049557585 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:16:52 compute-1 sshd-session[208320]: Invalid user alma from 85.209.134.43 port 50240
Nov 24 13:16:53 compute-1 sshd-session[208320]: Received disconnect from 85.209.134.43 port 50240:11: Bye Bye [preauth]
Nov 24 13:16:53 compute-1 sshd-session[208320]: Disconnected from invalid user alma 85.209.134.43 port 50240 [preauth]
Nov 24 13:16:53 compute-1 podman[208322]: 2025-11-24 13:16:53.060088557 +0000 UTC m=+0.063712212 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:16:53 compute-1 podman[208323]: 2025-11-24 13:16:53.167765122 +0000 UTC m=+0.166668948 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 24 13:16:57 compute-1 sshd-session[208365]: Invalid user sol from 45.148.10.240 port 57290
Nov 24 13:16:57 compute-1 sshd-session[208365]: Connection closed by invalid user sol 45.148.10.240 port 57290 [preauth]
Nov 24 13:17:01 compute-1 podman[208367]: 2025-11-24 13:17:01.534295178 +0000 UTC m=+0.075440591 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64)
Nov 24 13:17:02 compute-1 nova_compute[187078]: 2025-11-24 13:17:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:02 compute-1 nova_compute[187078]: 2025-11-24 13:17:02.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:02 compute-1 nova_compute[187078]: 2025-11-24 13:17:02.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:02 compute-1 nova_compute[187078]: 2025-11-24 13:17:02.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:02 compute-1 nova_compute[187078]: 2025-11-24 13:17:02.669 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.738 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.739 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.761 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.761 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.762 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.762 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.906 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.907 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6188MB free_disk=73.49727249145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.907 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.907 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.956 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:17:03 compute-1 nova_compute[187078]: 2025-11-24 13:17:03.957 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:17:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:03.966 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:17:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:03.967 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:17:04 compute-1 nova_compute[187078]: 2025-11-24 13:17:04.004 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:17:04 compute-1 nova_compute[187078]: 2025-11-24 13:17:04.014 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:17:04 compute-1 nova_compute[187078]: 2025-11-24 13:17:04.015 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:17:04 compute-1 nova_compute[187078]: 2025-11-24 13:17:04.015 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:04.144 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:04.145 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:04.145 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:04 compute-1 nova_compute[187078]: 2025-11-24 13:17:04.943 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:04 compute-1 nova_compute[187078]: 2025-11-24 13:17:04.943 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:05 compute-1 podman[197429]: time="2025-11-24T13:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:17:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:17:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 24 13:17:05 compute-1 nova_compute[187078]: 2025-11-24 13:17:05.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:17:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:05.969 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:17 compute-1 podman[208389]: 2025-11-24 13:17:17.510798329 +0000 UTC m=+0.055352055 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: ERROR   13:17:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: ERROR   13:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: ERROR   13:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: ERROR   13:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: ERROR   13:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:17:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:17:21 compute-1 sshd-session[208413]: Invalid user debianuser from 5.198.176.28 port 43390
Nov 24 13:17:21 compute-1 podman[208415]: 2025-11-24 13:17:21.407842189 +0000 UTC m=+0.058392398 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 13:17:21 compute-1 sshd-session[208413]: Received disconnect from 5.198.176.28 port 43390:11: Bye Bye [preauth]
Nov 24 13:17:21 compute-1 sshd-session[208413]: Disconnected from invalid user debianuser 5.198.176.28 port 43390 [preauth]
Nov 24 13:17:23 compute-1 podman[208435]: 2025-11-24 13:17:23.513868134 +0000 UTC m=+0.065000708 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 13:17:23 compute-1 podman[208436]: 2025-11-24 13:17:23.558897757 +0000 UTC m=+0.106219548 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:17:25 compute-1 sshd-session[208479]: Invalid user postgres from 193.32.162.145 port 42570
Nov 24 13:17:25 compute-1 sshd-session[208479]: Connection closed by invalid user postgres 193.32.162.145 port 42570 [preauth]
Nov 24 13:17:30 compute-1 sshd-session[208481]: Invalid user sonar from 68.183.82.237 port 47872
Nov 24 13:17:30 compute-1 sshd-session[208481]: Received disconnect from 68.183.82.237 port 47872:11: Bye Bye [preauth]
Nov 24 13:17:30 compute-1 sshd-session[208481]: Disconnected from invalid user sonar 68.183.82.237 port 47872 [preauth]
Nov 24 13:17:32 compute-1 podman[208483]: 2025-11-24 13:17:32.517878805 +0000 UTC m=+0.063213129 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Nov 24 13:17:35 compute-1 podman[197429]: time="2025-11-24T13:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:17:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:17:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Nov 24 13:17:36 compute-1 nova_compute[187078]: 2025-11-24 13:17:36.989 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:36 compute-1 nova_compute[187078]: 2025-11-24 13:17:36.989 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.020 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.152 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.153 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.162 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.163 187082 INFO nova.compute.claims [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.303 187082 DEBUG nova.compute.provider_tree [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.316 187082 DEBUG nova.scheduler.client.report [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.339 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.340 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.385 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.386 187082 DEBUG nova.network.neutron [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.406 187082 INFO nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.424 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.527 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.529 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.529 187082 INFO nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Creating image(s)
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.530 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.531 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.531 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.532 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.532 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.969 187082 WARNING oslo_policy.policy [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.970 187082 WARNING oslo_policy.policy [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 24 13:17:37 compute-1 nova_compute[187078]: 2025-11-24 13:17:37.973 187082 DEBUG nova.policy [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bad71b4865594b828cc87e37a3107bc4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.177 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.226 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.part --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.228 187082 DEBUG nova.virt.images [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] 1d4afc77-cb95-49a2-9165-f8ceca2998fc was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.229 187082 DEBUG nova.privsep.utils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.229 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.part /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.439 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.part /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.converted" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.444 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.494 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f.converted --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.496 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.508 187082 INFO oslo.privsep.daemon [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpudiier9j/privsep.sock']
Nov 24 13:17:39 compute-1 nova_compute[187078]: 2025-11-24 13:17:39.893 187082 DEBUG nova.network.neutron [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Successfully created port: 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.133 187082 INFO oslo.privsep.daemon [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Spawned new privsep daemon via rootwrap
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.011 208526 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.015 208526 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.017 208526 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.018 208526 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208526
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.220 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.294 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.295 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.296 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.306 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.357 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.358 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.386 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.387 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.387 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.437 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.438 187082 DEBUG nova.virt.disk.api [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Checking if we can resize image /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.438 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.488 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.489 187082 DEBUG nova.virt.disk.api [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Cannot resize image /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.489 187082 DEBUG nova.objects.instance [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'migration_context' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.503 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.503 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Ensure instance console log exists: /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.503 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.504 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:40 compute-1 nova_compute[187078]: 2025-11-24 13:17:40.504 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:40 compute-1 sshd-session[208524]: Invalid user alma from 176.114.89.34 port 45522
Nov 24 13:17:40 compute-1 sshd-session[208524]: Received disconnect from 176.114.89.34 port 45522:11: Bye Bye [preauth]
Nov 24 13:17:40 compute-1 sshd-session[208524]: Disconnected from invalid user alma 176.114.89.34 port 45522 [preauth]
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.145 187082 DEBUG nova.network.neutron [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Successfully updated port: 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.160 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.161 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.162 187082 DEBUG nova.network.neutron [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.323 187082 DEBUG nova.network.neutron [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.642 187082 DEBUG nova.compute.manager [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-changed-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.643 187082 DEBUG nova.compute.manager [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Refreshing instance network info cache due to event network-changed-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:17:42 compute-1 nova_compute[187078]: 2025-11-24 13:17:42.644 187082 DEBUG oslo_concurrency.lockutils [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.382 187082 DEBUG nova.network.neutron [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.469 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.470 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance network_info: |[{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.471 187082 DEBUG oslo_concurrency.lockutils [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.471 187082 DEBUG nova.network.neutron [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Refreshing network info cache for port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.478 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Start _get_guest_xml network_info=[{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.487 187082 WARNING nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.496 187082 DEBUG nova.virt.libvirt.host [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.497 187082 DEBUG nova.virt.libvirt.host [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.501 187082 DEBUG nova.virt.libvirt.host [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.502 187082 DEBUG nova.virt.libvirt.host [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.504 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.505 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.505 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.506 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.506 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.506 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.507 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.507 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.507 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.507 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.508 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.508 187082 DEBUG nova.virt.hardware [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.514 187082 DEBUG nova.privsep.utils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.515 187082 DEBUG nova.virt.libvirt.vif [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1684048061',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1684048061',id=1,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-yt51c9yn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:17:37Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=e949dac9-04e8-4bf5-b73c-32ab3fc59472,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.515 187082 DEBUG nova.network.os_vif_util [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.516 187082 DEBUG nova.network.os_vif_util [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.518 187082 DEBUG nova.objects.instance [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'pci_devices' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.537 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <uuid>e949dac9-04e8-4bf5-b73c-32ab3fc59472</uuid>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <name>instance-00000001</name>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1684048061</nova:name>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:17:43</nova:creationTime>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:user uuid="bad71b4865594b828cc87e37a3107bc4">tempest-TestExecuteActionsViaActuator-1613857129-project-member</nova:user>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:project uuid="5383ea8abbd144f89d959d9b1f9c052f">tempest-TestExecuteActionsViaActuator-1613857129</nova:project>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         <nova:port uuid="3c9ecd74-5bbd-4ab3-ad59-929239c5a81e">
Nov 24 13:17:43 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <system>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <entry name="serial">e949dac9-04e8-4bf5-b73c-32ab3fc59472</entry>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <entry name="uuid">e949dac9-04e8-4bf5-b73c-32ab3fc59472</entry>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </system>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <os>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </os>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <features>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </features>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:10:b3:4c"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <target dev="tap3c9ecd74-5b"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/console.log" append="off"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <video>
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </video>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:17:43 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:17:43 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:17:43 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:17:43 compute-1 nova_compute[187078]: </domain>
Nov 24 13:17:43 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.539 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Preparing to wait for external event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.539 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.540 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.540 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.541 187082 DEBUG nova.virt.libvirt.vif [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1684048061',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1684048061',id=1,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-yt51c9yn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:17:37Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=e949dac9-04e8-4bf5-b73c-32ab3fc59472,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.541 187082 DEBUG nova.network.os_vif_util [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.542 187082 DEBUG nova.network.os_vif_util [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.542 187082 DEBUG os_vif [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.576 187082 DEBUG ovsdbapp.backend.ovs_idl [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.577 187082 DEBUG ovsdbapp.backend.ovs_idl [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.577 187082 DEBUG ovsdbapp.backend.ovs_idl [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.577 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.578 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.578 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.579 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.580 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.582 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.591 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.591 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.591 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:17:43 compute-1 nova_compute[187078]: 2025-11-24 13:17:43.593 187082 INFO oslo.privsep.daemon [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp72a42om2/privsep.sock']
Nov 24 13:17:44 compute-1 sshd-session[208543]: Invalid user openbravo from 175.100.24.139 port 45464
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.370 187082 INFO oslo.privsep.daemon [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Spawned new privsep daemon via rootwrap
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.208 208549 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.215 208549 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.219 208549 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.219 208549 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208549
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.568 187082 DEBUG nova.network.neutron [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updated VIF entry in instance network info cache for port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.569 187082 DEBUG nova.network.neutron [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.583 187082 DEBUG oslo_concurrency.lockutils [req-ece4dcef-3aca-4e7e-90cb-b9492d462a03 req-f99f5337-4097-437b-969d-be33d9f046ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.675 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.675 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c9ecd74-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.676 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c9ecd74-5b, col_values=(('external_ids', {'iface-id': '3c9ecd74-5bbd-4ab3-ad59-929239c5a81e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:b3:4c', 'vm-uuid': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:44 compute-1 NetworkManager[55527]: <info>  [1763990264.6789] manager: (tap3c9ecd74-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.680 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.685 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.685 187082 INFO os_vif [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b')
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.722 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.722 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.723 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No VIF found with MAC fa:16:3e:10:b3:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:17:44 compute-1 nova_compute[187078]: 2025-11-24 13:17:44.723 187082 INFO nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Using config drive
Nov 24 13:17:44 compute-1 sshd-session[208543]: Received disconnect from 175.100.24.139 port 45464:11: Bye Bye [preauth]
Nov 24 13:17:44 compute-1 sshd-session[208543]: Disconnected from invalid user openbravo 175.100.24.139 port 45464 [preauth]
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.359 187082 INFO nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Creating config drive at /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.364 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpikpwu4da execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.486 187082 DEBUG oslo_concurrency.processutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpikpwu4da" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:17:45 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 24 13:17:45 compute-1 kernel: tap3c9ecd74-5b: entered promiscuous mode
Nov 24 13:17:45 compute-1 NetworkManager[55527]: <info>  [1763990265.5911] manager: (tap3c9ecd74-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Nov 24 13:17:45 compute-1 ovn_controller[95368]: 2025-11-24T13:17:45Z|00027|binding|INFO|Claiming lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for this chassis.
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.590 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:45 compute-1 ovn_controller[95368]: 2025-11-24T13:17:45Z|00028|binding|INFO|3c9ecd74-5bbd-4ab3-ad59-929239c5a81e: Claiming fa:16:3e:10:b3:4c 10.100.0.13
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.597 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:45 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:45.617 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:b3:4c 10.100.0.13'], port_security=['fa:16:3e:10:b3:4c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:17:45 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:45.619 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 bound to our chassis
Nov 24 13:17:45 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:45.624 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:17:45 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:45.625 104225 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpuxiasse0/privsep.sock']
Nov 24 13:17:45 compute-1 systemd-udevd[208574]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:17:45 compute-1 NetworkManager[55527]: <info>  [1763990265.6427] device (tap3c9ecd74-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:17:45 compute-1 NetworkManager[55527]: <info>  [1763990265.6456] device (tap3c9ecd74-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.679 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:45 compute-1 systemd-machined[153355]: New machine qemu-1-instance-00000001.
Nov 24 13:17:45 compute-1 ovn_controller[95368]: 2025-11-24T13:17:45Z|00029|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e ovn-installed in OVS
Nov 24 13:17:45 compute-1 ovn_controller[95368]: 2025-11-24T13:17:45Z|00030|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e up in Southbound
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.687 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:45 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.970 187082 DEBUG nova.compute.manager [req-2c85a05b-9401-4d73-85cd-1f084d311540 req-8986c625-217a-4cfe-91f2-59f6bbe6bec4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.970 187082 DEBUG oslo_concurrency.lockutils [req-2c85a05b-9401-4d73-85cd-1f084d311540 req-8986c625-217a-4cfe-91f2-59f6bbe6bec4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.970 187082 DEBUG oslo_concurrency.lockutils [req-2c85a05b-9401-4d73-85cd-1f084d311540 req-8986c625-217a-4cfe-91f2-59f6bbe6bec4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.970 187082 DEBUG oslo_concurrency.lockutils [req-2c85a05b-9401-4d73-85cd-1f084d311540 req-8986c625-217a-4cfe-91f2-59f6bbe6bec4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:45 compute-1 nova_compute[187078]: 2025-11-24 13:17:45.971 187082 DEBUG nova.compute.manager [req-2c85a05b-9401-4d73-85cd-1f084d311540 req-8986c625-217a-4cfe-91f2-59f6bbe6bec4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Processing event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.097 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990266.096539, e949dac9-04e8-4bf5-b73c-32ab3fc59472 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.097 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] VM Started (Lifecycle Event)
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.099 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.111 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.115 187082 INFO nova.virt.libvirt.driver [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance spawned successfully.
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.115 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.136 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.139 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.145 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.146 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.146 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.147 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.147 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.148 187082 DEBUG nova.virt.libvirt.driver [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.167 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.167 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990266.0967383, e949dac9-04e8-4bf5-b73c-32ab3fc59472 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.168 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] VM Paused (Lifecycle Event)
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.184 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.192 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990266.1111088, e949dac9-04e8-4bf5-b73c-32ab3fc59472 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.193 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] VM Resumed (Lifecycle Event)
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.213 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.216 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.233 187082 INFO nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Took 8.70 seconds to spawn the instance on the hypervisor.
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.234 187082 DEBUG nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.236 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.289 104225 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.289 104225 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpuxiasse0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.184 208599 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.189 208599 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.191 208599 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.191 208599 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208599
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.292 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[54d52107-cffc-4c33-8031-2f6743e05070]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.297 187082 INFO nova.compute.manager [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Took 9.18 seconds to build instance.
Nov 24 13:17:46 compute-1 nova_compute[187078]: 2025-11-24 13:17:46.315 187082 DEBUG oslo_concurrency.lockutils [None req-fa40d809-a056-4b51-87b7-8c9a49f707dd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.767 208599 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.767 208599 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:46.767 208599 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:47 compute-1 nova_compute[187078]: 2025-11-24 13:17:47.146 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.303 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a81612-329a-494d-bdcb-54fa2c8761a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.304 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap173735b5-01 in ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.307 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap173735b5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.307 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[25ca9d67-0b53-42ce-9cb0-5373320bb3ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.311 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7d0b72-d91b-4513-ab1b-a9a1de470a6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.333 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[4aef2416-9594-4b27-b114-c3580ebf55f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.356 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0d232f80-bec4-4462-94b0-2a69a2c7409f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.358 104225 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpz1xwdfps/privsep.sock']
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:48.028 104225 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:48.029 104225 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpz1xwdfps/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.892 208613 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.897 208613 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.899 208613 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:47.899 208613 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208613
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:48.034 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[1786132b-cf92-4d63-b437-838b8373a51f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:48 compute-1 nova_compute[187078]: 2025-11-24 13:17:48.114 187082 DEBUG nova.compute.manager [req-c66bf585-f9fa-4a84-8510-4053e5bf33c4 req-8bd15cbb-675e-4ea8-8c4a-95f31e0c5171 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:17:48 compute-1 nova_compute[187078]: 2025-11-24 13:17:48.114 187082 DEBUG oslo_concurrency.lockutils [req-c66bf585-f9fa-4a84-8510-4053e5bf33c4 req-8bd15cbb-675e-4ea8-8c4a-95f31e0c5171 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:48 compute-1 nova_compute[187078]: 2025-11-24 13:17:48.115 187082 DEBUG oslo_concurrency.lockutils [req-c66bf585-f9fa-4a84-8510-4053e5bf33c4 req-8bd15cbb-675e-4ea8-8c4a-95f31e0c5171 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:48 compute-1 nova_compute[187078]: 2025-11-24 13:17:48.115 187082 DEBUG oslo_concurrency.lockutils [req-c66bf585-f9fa-4a84-8510-4053e5bf33c4 req-8bd15cbb-675e-4ea8-8c4a-95f31e0c5171 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:48 compute-1 nova_compute[187078]: 2025-11-24 13:17:48.115 187082 DEBUG nova.compute.manager [req-c66bf585-f9fa-4a84-8510-4053e5bf33c4 req-8bd15cbb-675e-4ea8-8c4a-95f31e0c5171 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:17:48 compute-1 nova_compute[187078]: 2025-11-24 13:17:48.116 187082 WARNING nova.compute.manager [req-c66bf585-f9fa-4a84-8510-4053e5bf33c4 req-8bd15cbb-675e-4ea8-8c4a-95f31e0c5171 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received unexpected event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with vm_state active and task_state None.
Nov 24 13:17:48 compute-1 podman[208618]: 2025-11-24 13:17:48.516373284 +0000 UTC m=+0.044894991 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:48.525 208613 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:48.525 208613 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:17:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:48.525 208613 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.096 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d93ce2-acc8-41b5-9d10-26734131fc74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 NetworkManager[55527]: <info>  [1763990269.1264] manager: (tap173735b5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.124 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[733f2e58-31ce-4d22-bacc-abb5fd6741ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 systemd-udevd[208648]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.173 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[31885cd7-494f-4105-8305-fd720be4b777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.176 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4c93fe-7a79-4660-b858-82cf77b327a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 NetworkManager[55527]: <info>  [1763990269.1951] device (tap173735b5-00): carrier: link connected
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.199 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[935ec266-96fa-4bfb-8fb2-c43519da082c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.216 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[52e84f83-c3f8-4613-af21-859a28a72749]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 328785, 'reachable_time': 39725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208667, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.230 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e014dd-0ba1-4fc7-a941-89ab16e20b70]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:aa6e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 328785, 'tstamp': 328785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208668, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.244 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[40ee4452-c330-481b-8b5d-2dc94174cb33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 328785, 'reachable_time': 39725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208669, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.273 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c88bf094-9b3b-4c8d-a142-1232496d4e3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.331 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[debb7a40-23b6-40a1-be44-e758960c6ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.334 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.335 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.335 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:49 compute-1 nova_compute[187078]: 2025-11-24 13:17:49.337 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:49 compute-1 NetworkManager[55527]: <info>  [1763990269.3384] manager: (tap173735b5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Nov 24 13:17:49 compute-1 kernel: tap173735b5-00: entered promiscuous mode
Nov 24 13:17:49 compute-1 nova_compute[187078]: 2025-11-24 13:17:49.340 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.341 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:17:49 compute-1 nova_compute[187078]: 2025-11-24 13:17:49.342 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:49 compute-1 ovn_controller[95368]: 2025-11-24T13:17:49Z|00031|binding|INFO|Releasing lport 05d2a163-89ad-4be0-a5cd-d2951a560cf8 from this chassis (sb_readonly=0)
Nov 24 13:17:49 compute-1 nova_compute[187078]: 2025-11-24 13:17:49.353 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.355 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/173735b5-05cb-4490-be96-4caf1fa864d7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/173735b5-05cb-4490-be96-4caf1fa864d7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.356 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4c9d61-90d5-4c02-b3bf-60dabaddb91d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.358 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/173735b5-05cb-4490-be96-4caf1fa864d7.pid.haproxy
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:17:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:17:49.359 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'env', 'PROCESS_TAG=haproxy-173735b5-05cb-4490-be96-4caf1fa864d7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/173735b5-05cb-4490-be96-4caf1fa864d7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: ERROR   13:17:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: ERROR   13:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: ERROR   13:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: ERROR   13:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: ERROR   13:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:17:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:17:49 compute-1 nova_compute[187078]: 2025-11-24 13:17:49.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:49 compute-1 podman[208702]: 2025-11-24 13:17:49.718294209 +0000 UTC m=+0.052856458 container create 0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:17:49 compute-1 systemd[1]: Started libpod-conmon-0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff.scope.
Nov 24 13:17:49 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:17:49 compute-1 podman[208702]: 2025-11-24 13:17:49.691008797 +0000 UTC m=+0.025571076 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:17:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/224a06762d6f52636ceec6287c2c0f55f8c3a614bba3627e7d1d6f22c0d6770a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:17:49 compute-1 podman[208702]: 2025-11-24 13:17:49.809151007 +0000 UTC m=+0.143713266 container init 0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:17:49 compute-1 podman[208702]: 2025-11-24 13:17:49.814784421 +0000 UTC m=+0.149346670 container start 0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:17:49 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [NOTICE]   (208721) : New worker (208723) forked
Nov 24 13:17:49 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [NOTICE]   (208721) : Loading success.
Nov 24 13:17:51 compute-1 podman[208732]: 2025-11-24 13:17:51.513731373 +0000 UTC m=+0.057399401 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 24 13:17:51 compute-1 sshd-session[208745]: Invalid user seafile from 85.209.134.43 port 43918
Nov 24 13:17:51 compute-1 sshd-session[208745]: Received disconnect from 85.209.134.43 port 43918:11: Bye Bye [preauth]
Nov 24 13:17:51 compute-1 sshd-session[208745]: Disconnected from invalid user seafile 85.209.134.43 port 43918 [preauth]
Nov 24 13:17:52 compute-1 nova_compute[187078]: 2025-11-24 13:17:52.149 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:54 compute-1 podman[208754]: 2025-11-24 13:17:54.532063771 +0000 UTC m=+0.076884420 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:17:54 compute-1 podman[208755]: 2025-11-24 13:17:54.589050371 +0000 UTC m=+0.121205716 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:17:54 compute-1 nova_compute[187078]: 2025-11-24 13:17:54.680 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:57 compute-1 nova_compute[187078]: 2025-11-24 13:17:57.150 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:17:59 compute-1 ovn_controller[95368]: 2025-11-24T13:17:59Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:b3:4c 10.100.0.13
Nov 24 13:17:59 compute-1 ovn_controller[95368]: 2025-11-24T13:17:59Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:b3:4c 10.100.0.13
Nov 24 13:17:59 compute-1 nova_compute[187078]: 2025-11-24 13:17:59.682 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:02 compute-1 nova_compute[187078]: 2025-11-24 13:18:02.152 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:02 compute-1 nova_compute[187078]: 2025-11-24 13:18:02.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:02 compute-1 nova_compute[187078]: 2025-11-24 13:18:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:02 compute-1 nova_compute[187078]: 2025-11-24 13:18:02.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:02 compute-1 nova_compute[187078]: 2025-11-24 13:18:02.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:18:03 compute-1 podman[208817]: 2025-11-24 13:18:03.528800438 +0000 UTC m=+0.061393140 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:18:03 compute-1 nova_compute[187078]: 2025-11-24 13:18:03.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:04.145 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:04.146 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:04.148 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:04 compute-1 nova_compute[187078]: 2025-11-24 13:18:04.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:04 compute-1 nova_compute[187078]: 2025-11-24 13:18:04.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:18:04 compute-1 nova_compute[187078]: 2025-11-24 13:18:04.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:18:04 compute-1 nova_compute[187078]: 2025-11-24 13:18:04.684 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:05 compute-1 podman[197429]: time="2025-11-24T13:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:18:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:18:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3034 "" "Go-http-client/1.1"
Nov 24 13:18:06 compute-1 nova_compute[187078]: 2025-11-24 13:18:06.866 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:06 compute-1 nova_compute[187078]: 2025-11-24 13:18:06.867 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:06 compute-1 nova_compute[187078]: 2025-11-24 13:18:06.868 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:18:06 compute-1 nova_compute[187078]: 2025-11-24 13:18:06.868 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:07 compute-1 nova_compute[187078]: 2025-11-24 13:18:07.193 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:09 compute-1 nova_compute[187078]: 2025-11-24 13:18:09.686 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.694 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.766 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.766 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.767 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.768 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.768 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.789 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.790 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.790 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.791 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.860 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.921 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.922 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:11 compute-1 nova_compute[187078]: 2025-11-24 13:18:11.991 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.173 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.175 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5747MB free_disk=73.43416213989258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.175 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.175 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.196 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.300 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance e949dac9-04e8-4bf5-b73c-32ab3fc59472 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.300 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.301 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.354 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.401 187082 ERROR nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [req-fcca8d9f-f702-40c9-aaa0-e359114ce7b9] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID ece8f004-1d5b-407f-a713-f9e87706b045.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-fcca8d9f-f702-40c9-aaa0-e359114ce7b9"}]}
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.420 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.436 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.436 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.456 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.475 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.522 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.697 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updated inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.697 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.698 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.731 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:18:12 compute-1 nova_compute[187078]: 2025-11-24 13:18:12.732 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:13 compute-1 nova_compute[187078]: 2025-11-24 13:18:13.727 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:13 compute-1 nova_compute[187078]: 2025-11-24 13:18:13.728 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:18:14 compute-1 nova_compute[187078]: 2025-11-24 13:18:14.688 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:17 compute-1 nova_compute[187078]: 2025-11-24 13:18:17.198 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: ERROR   13:18:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: ERROR   13:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: ERROR   13:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: ERROR   13:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: ERROR   13:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:18:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:18:19 compute-1 podman[208846]: 2025-11-24 13:18:19.532378911 +0000 UTC m=+0.068713130 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:18:19 compute-1 nova_compute[187078]: 2025-11-24 13:18:19.692 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.318 187082 DEBUG nova.compute.manager [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.560 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.561 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.588 187082 DEBUG nova.objects.instance [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'pci_requests' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.601 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.602 187082 INFO nova.compute.claims [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.603 187082 DEBUG nova.objects.instance [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'resources' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.612 187082 DEBUG nova.objects.instance [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'pci_devices' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.651 187082 INFO nova.compute.resource_tracker [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating resource usage from migration 900ad185-d5c2-4850-9e2f-f55afb0054ce
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.720 187082 DEBUG nova.compute.provider_tree [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.734 187082 DEBUG nova.scheduler.client.report [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.757 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.758 187082 INFO nova.compute.manager [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Migrating
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.758 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.758 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.764 187082 INFO nova.compute.rpcapi [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.764 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.828 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.829 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:20 compute-1 nova_compute[187078]: 2025-11-24 13:18:20.829 187082 DEBUG nova.network.neutron [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:18:22 compute-1 nova_compute[187078]: 2025-11-24 13:18:22.199 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:22 compute-1 podman[208870]: 2025-11-24 13:18:22.525516564 +0000 UTC m=+0.062171041 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:18:24 compute-1 nova_compute[187078]: 2025-11-24 13:18:24.696 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:25 compute-1 podman[208889]: 2025-11-24 13:18:25.583942603 +0000 UTC m=+0.115455809 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:18:25 compute-1 podman[208890]: 2025-11-24 13:18:25.595016536 +0000 UTC m=+0.118894883 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:18:25 compute-1 nova_compute[187078]: 2025-11-24 13:18:25.845 187082 DEBUG nova.network.neutron [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:18:25 compute-1 nova_compute[187078]: 2025-11-24 13:18:25.857 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:25 compute-1 nova_compute[187078]: 2025-11-24 13:18:25.991 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 24 13:18:25 compute-1 nova_compute[187078]: 2025-11-24 13:18:25.996 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 24 13:18:26 compute-1 ovn_controller[95368]: 2025-11-24T13:18:26Z|00032|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 24 13:18:27 compute-1 nova_compute[187078]: 2025-11-24 13:18:27.202 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:28 compute-1 kernel: tap3c9ecd74-5b (unregistering): left promiscuous mode
Nov 24 13:18:28 compute-1 NetworkManager[55527]: <info>  [1763990308.2453] device (tap3c9ecd74-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:18:28 compute-1 ovn_controller[95368]: 2025-11-24T13:18:28Z|00033|binding|INFO|Releasing lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e from this chassis (sb_readonly=0)
Nov 24 13:18:28 compute-1 ovn_controller[95368]: 2025-11-24T13:18:28Z|00034|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e down in Southbound
Nov 24 13:18:28 compute-1 ovn_controller[95368]: 2025-11-24T13:18:28Z|00035|binding|INFO|Removing iface tap3c9ecd74-5b ovn-installed in OVS
Nov 24 13:18:28 compute-1 nova_compute[187078]: 2025-11-24 13:18:28.252 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:28 compute-1 nova_compute[187078]: 2025-11-24 13:18:28.254 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:28 compute-1 nova_compute[187078]: 2025-11-24 13:18:28.275 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.278 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:b3:4c 10.100.0.13'], port_security=['fa:16:3e:10:b3:4c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.280 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.281 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 173735b5-05cb-4490-be96-4caf1fa864d7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.283 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[fd443125-4987-48a4-af73-e5047059f3d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.283 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 namespace which is not needed anymore
Nov 24 13:18:28 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 24 13:18:28 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 14.731s CPU time.
Nov 24 13:18:28 compute-1 systemd-machined[153355]: Machine qemu-1-instance-00000001 terminated.
Nov 24 13:18:28 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [NOTICE]   (208721) : haproxy version is 2.8.14-c23fe91
Nov 24 13:18:28 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [NOTICE]   (208721) : path to executable is /usr/sbin/haproxy
Nov 24 13:18:28 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [WARNING]  (208721) : Exiting Master process...
Nov 24 13:18:28 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [ALERT]    (208721) : Current worker (208723) exited with code 143 (Terminated)
Nov 24 13:18:28 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[208717]: [WARNING]  (208721) : All workers exited. Exiting... (0)
Nov 24 13:18:28 compute-1 systemd[1]: libpod-0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff.scope: Deactivated successfully.
Nov 24 13:18:28 compute-1 podman[208965]: 2025-11-24 13:18:28.452922539 +0000 UTC m=+0.058290006 container died 0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 13:18:28 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff-userdata-shm.mount: Deactivated successfully.
Nov 24 13:18:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-224a06762d6f52636ceec6287c2c0f55f8c3a614bba3627e7d1d6f22c0d6770a-merged.mount: Deactivated successfully.
Nov 24 13:18:28 compute-1 podman[208965]: 2025-11-24 13:18:28.501962579 +0000 UTC m=+0.107330036 container cleanup 0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 13:18:28 compute-1 systemd[1]: libpod-conmon-0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff.scope: Deactivated successfully.
Nov 24 13:18:28 compute-1 podman[209012]: 2025-11-24 13:18:28.572989993 +0000 UTC m=+0.047676275 container remove 0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.579 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c5ce47-36f6-4530-8edd-ab830da572a5]: (4, ('Mon Nov 24 01:18:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 (0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff)\n0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff\nMon Nov 24 01:18:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 (0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff)\n0976dfd1c788486e8f28fa082741ac0765f99ac199d01fdea4bd606f7049b7ff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.582 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee2a944-f4ca-40b5-9ef1-d5efd6081030]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.584 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:28 compute-1 sshd-session[208938]: Invalid user daniel from 5.198.176.28 port 43494
Nov 24 13:18:28 compute-1 nova_compute[187078]: 2025-11-24 13:18:28.588 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:28 compute-1 kernel: tap173735b5-00: left promiscuous mode
Nov 24 13:18:28 compute-1 nova_compute[187078]: 2025-11-24 13:18:28.609 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.613 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2d53130d-e2e5-485a-879a-165620af9eec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.630 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[75e3c0cb-d78f-4e03-a625-a203b3037021]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.631 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[336b5734-8f83-44bd-9837-889ffbe5b967]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.649 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[00a4cb62-9b9a-49a2-a613-de7764bc2dda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 328774, 'reachable_time': 34998, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209034, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 systemd[1]: run-netns-ovnmeta\x2d173735b5\x2d05cb\x2d4490\x2dbe96\x2d4caf1fa864d7.mount: Deactivated successfully.
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.662 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:18:28 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:28.663 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[f5dad0db-f231-4fbd-ab12-5637a5d9f3c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:28 compute-1 sshd-session[208938]: Received disconnect from 5.198.176.28 port 43494:11: Bye Bye [preauth]
Nov 24 13:18:28 compute-1 sshd-session[208938]: Disconnected from invalid user daniel 5.198.176.28 port 43494 [preauth]
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.018 187082 INFO nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance shutdown successfully after 3 seconds.
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.028 187082 INFO nova.virt.libvirt.driver [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance destroyed successfully.
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.030 187082 DEBUG nova.virt.libvirt.vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1684048061',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1684048061',id=1,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:17:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-yt51c9yn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:18:20Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=e949dac9-04e8-4bf5-b73c-32ab3fc59472,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.030 187082 DEBUG nova.network.os_vif_util [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.032 187082 DEBUG nova.network.os_vif_util [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.033 187082 DEBUG os_vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.036 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.036 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c9ecd74-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.039 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.041 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.045 187082 INFO os_vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b')
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.052 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.118 187082 DEBUG nova.compute.manager [req-167d4986-2fef-43ff-8630-94f5b557b068 req-82ab8901-ccc0-47c7-a8f2-fcca791890a2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-unplugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.119 187082 DEBUG oslo_concurrency.lockutils [req-167d4986-2fef-43ff-8630-94f5b557b068 req-82ab8901-ccc0-47c7-a8f2-fcca791890a2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.120 187082 DEBUG oslo_concurrency.lockutils [req-167d4986-2fef-43ff-8630-94f5b557b068 req-82ab8901-ccc0-47c7-a8f2-fcca791890a2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.120 187082 DEBUG oslo_concurrency.lockutils [req-167d4986-2fef-43ff-8630-94f5b557b068 req-82ab8901-ccc0-47c7-a8f2-fcca791890a2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.120 187082 DEBUG nova.compute.manager [req-167d4986-2fef-43ff-8630-94f5b557b068 req-82ab8901-ccc0-47c7-a8f2-fcca791890a2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-unplugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.121 187082 WARNING nova.compute.manager [req-167d4986-2fef-43ff-8630-94f5b557b068 req-82ab8901-ccc0-47c7-a8f2-fcca791890a2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received unexpected event network-vif-unplugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with vm_state active and task_state resize_migrating.
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.123 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.123 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.181 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.184 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_resize/disk /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:29 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:29.235 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:18:29 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:29.236 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.238 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.239 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "cp -r /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_resize/disk /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.240 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_resize/disk.config /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.270 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "cp -r /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_resize/disk.config /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.271 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_resize/disk.info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.301 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "cp -r /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_resize/disk.info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.685 187082 DEBUG nova.network.neutron [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.784 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.785 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.786 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.945 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.946 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:29 compute-1 nova_compute[187078]: 2025-11-24 13:18:29.946 187082 DEBUG nova.network.neutron [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.054 187082 DEBUG nova.network.neutron [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.079 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.187 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.188 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.188 187082 INFO nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Creating image(s)
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.190 187082 DEBUG nova.objects.instance [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.197 187082 DEBUG nova.compute.manager [req-5a812fe4-24ef-4146-ab56-8166b101783e req-beea4f90-0780-4490-b7a5-f22f40610c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.197 187082 DEBUG oslo_concurrency.lockutils [req-5a812fe4-24ef-4146-ab56-8166b101783e req-beea4f90-0780-4490-b7a5-f22f40610c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.198 187082 DEBUG oslo_concurrency.lockutils [req-5a812fe4-24ef-4146-ab56-8166b101783e req-beea4f90-0780-4490-b7a5-f22f40610c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.198 187082 DEBUG oslo_concurrency.lockutils [req-5a812fe4-24ef-4146-ab56-8166b101783e req-beea4f90-0780-4490-b7a5-f22f40610c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.198 187082 DEBUG nova.compute.manager [req-5a812fe4-24ef-4146-ab56-8166b101783e req-beea4f90-0780-4490-b7a5-f22f40610c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.198 187082 WARNING nova.compute.manager [req-5a812fe4-24ef-4146-ab56-8166b101783e req-beea4f90-0780-4490-b7a5-f22f40610c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received unexpected event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with vm_state active and task_state resize_finish.
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.207 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.261 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.262 187082 DEBUG nova.virt.disk.api [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Checking if we can resize image /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.262 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.358 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.359 187082 DEBUG nova.virt.disk.api [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Cannot resize image /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.374 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.375 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Ensure instance console log exists: /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.375 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.376 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.376 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.379 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Start _get_guest_xml network_info=[{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.384 187082 WARNING nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.393 187082 DEBUG nova.virt.libvirt.host [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.394 187082 DEBUG nova.virt.libvirt.host [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.398 187082 DEBUG nova.virt.libvirt.host [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.399 187082 DEBUG nova.virt.libvirt.host [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.401 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.401 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00785036-73c3-4202-80be-2dc06466a80f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.401 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.402 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.402 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.402 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.402 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.402 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.403 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.403 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.403 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.403 187082 DEBUG nova.virt.hardware [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.403 187082 DEBUG nova.objects.instance [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'vcpu_model' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.423 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.483 187082 DEBUG oslo_concurrency.processutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.484 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.485 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.486 187082 DEBUG oslo_concurrency.lockutils [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.487 187082 DEBUG nova.virt.libvirt.vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1684048061',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1684048061',id=1,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:17:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-yt51c9yn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:18:29Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=e949dac9-04e8-4bf5-b73c-32ab3fc59472,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.487 187082 DEBUG nova.network.os_vif_util [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.488 187082 DEBUG nova.network.os_vif_util [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.491 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <uuid>e949dac9-04e8-4bf5-b73c-32ab3fc59472</uuid>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <name>instance-00000001</name>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <memory>196608</memory>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1684048061</nova:name>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:18:31</nova:creationTime>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:flavor name="m1.micro">
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:memory>192</nova:memory>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:user uuid="bad71b4865594b828cc87e37a3107bc4">tempest-TestExecuteActionsViaActuator-1613857129-project-member</nova:user>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:project uuid="5383ea8abbd144f89d959d9b1f9c052f">tempest-TestExecuteActionsViaActuator-1613857129</nova:project>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         <nova:port uuid="3c9ecd74-5bbd-4ab3-ad59-929239c5a81e">
Nov 24 13:18:31 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <system>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <entry name="serial">e949dac9-04e8-4bf5-b73c-32ab3fc59472</entry>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <entry name="uuid">e949dac9-04e8-4bf5-b73c-32ab3fc59472</entry>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </system>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <os>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </os>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <features>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </features>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk.config"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:10:b3:4c"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <target dev="tap3c9ecd74-5b"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/console.log" append="off"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <video>
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </video>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:18:31 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:18:31 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:18:31 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:18:31 compute-1 nova_compute[187078]: </domain>
Nov 24 13:18:31 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.493 187082 DEBUG nova.virt.libvirt.vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1684048061',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1684048061',id=1,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:17:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-yt51c9yn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:18:29Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=e949dac9-04e8-4bf5-b73c-32ab3fc59472,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.493 187082 DEBUG nova.network.os_vif_util [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:10:b3:4c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.493 187082 DEBUG nova.network.os_vif_util [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.494 187082 DEBUG os_vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.494 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.495 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.496 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.499 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.500 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c9ecd74-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.500 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c9ecd74-5b, col_values=(('external_ids', {'iface-id': '3c9ecd74-5bbd-4ab3-ad59-929239c5a81e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:b3:4c', 'vm-uuid': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.503 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 NetworkManager[55527]: <info>  [1763990311.5050] manager: (tap3c9ecd74-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.506 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.511 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.511 187082 INFO os_vif [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b')
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.573 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.575 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.575 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No VIF found with MAC fa:16:3e:10:b3:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.576 187082 INFO nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Using config drive
Nov 24 13:18:31 compute-1 NetworkManager[55527]: <info>  [1763990311.6616] manager: (tap3c9ecd74-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Nov 24 13:18:31 compute-1 kernel: tap3c9ecd74-5b: entered promiscuous mode
Nov 24 13:18:31 compute-1 ovn_controller[95368]: 2025-11-24T13:18:31Z|00036|binding|INFO|Claiming lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for this chassis.
Nov 24 13:18:31 compute-1 ovn_controller[95368]: 2025-11-24T13:18:31Z|00037|binding|INFO|3c9ecd74-5bbd-4ab3-ad59-929239c5a81e: Claiming fa:16:3e:10:b3:4c 10.100.0.13
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.665 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.674 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:b3:4c 10.100.0.13'], port_security=['fa:16:3e:10:b3:4c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.676 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 bound to our chassis
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.678 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:18:31 compute-1 ovn_controller[95368]: 2025-11-24T13:18:31Z|00038|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e ovn-installed in OVS
Nov 24 13:18:31 compute-1 ovn_controller[95368]: 2025-11-24T13:18:31Z|00039|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e up in Southbound
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.685 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 nova_compute[187078]: 2025-11-24 13:18:31.688 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.700 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[14f08471-78de-4390-9e77-98a2debdd203]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.701 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap173735b5-01 in ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:18:31 compute-1 systemd-udevd[209071]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.703 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap173735b5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.703 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5700ed-66fe-43c8-bda2-6bd9e26e38f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.705 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3869da1b-3116-427f-8a08-f7d72c73fc19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 systemd-machined[153355]: New machine qemu-2-instance-00000001.
Nov 24 13:18:31 compute-1 NetworkManager[55527]: <info>  [1763990311.7209] device (tap3c9ecd74-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:18:31 compute-1 NetworkManager[55527]: <info>  [1763990311.7220] device (tap3c9ecd74-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:18:31 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.723 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[6e683cca-220e-4614-81ec-37ff47c5e411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.754 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c04b40-6678-454f-93ae-45cc67dba10d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.801 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[ef936ad0-a0a2-449d-b56d-f0accce23671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.806 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4a73cd-0b8b-4cb8-b9d5-ea58b5f108f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 NetworkManager[55527]: <info>  [1763990311.8093] manager: (tap173735b5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.857 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5737ba-98ef-4371-80a8-09595f24e0ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.862 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5e344765-00a1-4373-897d-84865cef73b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 NetworkManager[55527]: <info>  [1763990311.8991] device (tap173735b5-00): carrier: link connected
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.907 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0a5d81-7816-4474-98b1-793b6833407f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.939 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5491dd-8380-4430-86f0-d562d72d18ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209105, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.962 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[adefbc58-bd97-4b19-8d37-630b7d45f45a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:aa6e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333055, 'tstamp': 333055}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209106, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:31 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:31.987 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9694b2-ac01-47e3-a0db-e67eeff33543]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209107, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.037 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6f2c43-f6d9-49a5-975c-eb7725272ef5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.128 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e08f2d69-70f3-4492-a9c6-3c32f9ce5f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.131 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.132 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.133 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.136 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:32 compute-1 NetworkManager[55527]: <info>  [1763990312.1379] manager: (tap173735b5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 24 13:18:32 compute-1 kernel: tap173735b5-00: entered promiscuous mode
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.143 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.145 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.147 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:32 compute-1 ovn_controller[95368]: 2025-11-24T13:18:32Z|00040|binding|INFO|Releasing lport 05d2a163-89ad-4be0-a5cd-d2951a560cf8 from this chassis (sb_readonly=0)
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.173 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.175 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/173735b5-05cb-4490-be96-4caf1fa864d7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/173735b5-05cb-4490-be96-4caf1fa864d7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.176 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5cfe1b-cdac-450c-b793-90e6e5c6c72a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.177 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/173735b5-05cb-4490-be96-4caf1fa864d7.pid.haproxy
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:18:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:32.178 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'env', 'PROCESS_TAG=haproxy-173735b5-05cb-4490-be96-4caf1fa864d7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/173735b5-05cb-4490-be96-4caf1fa864d7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.205 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.305 187082 DEBUG nova.virt.libvirt.host [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Removed pending event for e949dac9-04e8-4bf5-b73c-32ab3fc59472 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.306 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990312.3048835, e949dac9-04e8-4bf5-b73c-32ab3fc59472 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.306 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] VM Resumed (Lifecycle Event)
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.310 187082 DEBUG nova.compute.manager [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.316 187082 INFO nova.virt.libvirt.driver [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance running successfully.
Nov 24 13:18:32 compute-1 virtqemud[186628]: argument unsupported: QEMU guest agent is not configured
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.321 187082 DEBUG nova.virt.libvirt.guest [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.321 187082 DEBUG nova.virt.libvirt.driver [None req-a0173b39-861e-49a7-8250-dbe2808c3ae8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.326 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.331 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.347 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.348 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990312.3065586, e949dac9-04e8-4bf5-b73c-32ab3fc59472 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.348 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] VM Started (Lifecycle Event)
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.378 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:18:32 compute-1 nova_compute[187078]: 2025-11-24 13:18:32.384 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:18:32 compute-1 podman[209146]: 2025-11-24 13:18:32.693498309 +0000 UTC m=+0.074523880 container create 77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 24 13:18:32 compute-1 systemd[1]: Started libpod-conmon-77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4.scope.
Nov 24 13:18:32 compute-1 podman[209146]: 2025-11-24 13:18:32.656481976 +0000 UTC m=+0.037507607 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:18:32 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:18:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be0d56d3b480e174b2c1a9cbe335fc80027f5f897ea4c68e39977dbe6921c3af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:18:32 compute-1 podman[209146]: 2025-11-24 13:18:32.790393359 +0000 UTC m=+0.171418900 container init 77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 24 13:18:32 compute-1 podman[209146]: 2025-11-24 13:18:32.795404016 +0000 UTC m=+0.176429557 container start 77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:18:32 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [NOTICE]   (209163) : New worker (209165) forked
Nov 24 13:18:32 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [NOTICE]   (209163) : Loading success.
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.285 187082 DEBUG nova.compute.manager [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.286 187082 DEBUG oslo_concurrency.lockutils [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.286 187082 DEBUG oslo_concurrency.lockutils [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.286 187082 DEBUG oslo_concurrency.lockutils [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.286 187082 DEBUG nova.compute.manager [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.286 187082 WARNING nova.compute.manager [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received unexpected event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with vm_state resized and task_state None.
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.287 187082 DEBUG nova.compute.manager [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.287 187082 DEBUG oslo_concurrency.lockutils [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.287 187082 DEBUG oslo_concurrency.lockutils [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.287 187082 DEBUG oslo_concurrency.lockutils [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.287 187082 DEBUG nova.compute.manager [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.288 187082 WARNING nova.compute.manager [req-2c384532-3fdf-421e-8254-161ff39c2f29 req-944b0658-0bc0-4214-8689-80ac0a542892 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received unexpected event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with vm_state resized and task_state None.
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.906 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.906 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:33 compute-1 nova_compute[187078]: 2025-11-24 13:18:33.906 187082 DEBUG nova.compute.manager [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Nov 24 13:18:34 compute-1 podman[209176]: 2025-11-24 13:18:34.505434915 +0000 UTC m=+0.052423434 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm)
Nov 24 13:18:34 compute-1 nova_compute[187078]: 2025-11-24 13:18:34.832 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:34 compute-1 nova_compute[187078]: 2025-11-24 13:18:34.833 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:34 compute-1 nova_compute[187078]: 2025-11-24 13:18:34.833 187082 DEBUG nova.network.neutron [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:18:34 compute-1 nova_compute[187078]: 2025-11-24 13:18:34.833 187082 DEBUG nova.objects.instance [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'info_cache' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:35 compute-1 podman[197429]: time="2025-11-24T13:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:18:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:18:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3042 "" "Go-http-client/1.1"
Nov 24 13:18:36 compute-1 nova_compute[187078]: 2025-11-24 13:18:36.504 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:37 compute-1 nova_compute[187078]: 2025-11-24 13:18:37.208 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:37 compute-1 nova_compute[187078]: 2025-11-24 13:18:37.869 187082 DEBUG nova.network.neutron [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:18:37 compute-1 nova_compute[187078]: 2025-11-24 13:18:37.894 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:37 compute-1 nova_compute[187078]: 2025-11-24 13:18:37.895 187082 DEBUG nova.objects.instance [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.023 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.024 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.112 187082 DEBUG nova.compute.provider_tree [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.123 187082 DEBUG nova.scheduler.client.report [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.163 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.281 187082 INFO nova.scheduler.client.report [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration 900ad185-d5c2-4850-9e2f-f55afb0054ce
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.322 187082 DEBUG oslo_concurrency.lockutils [None req-cdac2257-37f0-41b1-861d-cb7eb0d2b410 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.505 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.505 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.516 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.566 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.567 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.574 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.574 187082 INFO nova.compute.claims [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.691 187082 DEBUG nova.compute.provider_tree [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.703 187082 DEBUG nova.scheduler.client.report [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.721 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.722 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.763 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.763 187082 DEBUG nova.network.neutron [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.781 187082 INFO nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.794 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.870 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.871 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.872 187082 INFO nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Creating image(s)
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.872 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "/var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.872 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "/var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.873 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "/var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.884 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.959 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.960 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.961 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:38 compute-1 nova_compute[187078]: 2025-11-24 13:18:38.976 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.028 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.029 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.062 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.063 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.063 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.116 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.117 187082 DEBUG nova.virt.disk.api [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Checking if we can resize image /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.118 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.178 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.179 187082 DEBUG nova.virt.disk.api [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Cannot resize image /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.179 187082 DEBUG nova.objects.instance [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'migration_context' on Instance uuid bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.194 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.194 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Ensure instance console log exists: /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.195 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.195 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.196 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:39 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:39.239 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:39 compute-1 nova_compute[187078]: 2025-11-24 13:18:39.684 187082 DEBUG nova.policy [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bad71b4865594b828cc87e37a3107bc4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:18:40 compute-1 nova_compute[187078]: 2025-11-24 13:18:40.895 187082 DEBUG nova.network.neutron [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Successfully created port: ced2619b-6589-4d8e-be2b-6abff02aa6a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:18:41 compute-1 nova_compute[187078]: 2025-11-24 13:18:41.507 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:42 compute-1 nova_compute[187078]: 2025-11-24 13:18:42.208 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.838 187082 DEBUG nova.network.neutron [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Successfully updated port: ced2619b-6589-4d8e-be2b-6abff02aa6a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.864 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.864 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquired lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.864 187082 DEBUG nova.network.neutron [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.923 187082 DEBUG nova.compute.manager [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-changed-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.924 187082 DEBUG nova.compute.manager [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Refreshing instance network info cache due to event network-changed-ced2619b-6589-4d8e-be2b-6abff02aa6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.924 187082 DEBUG oslo_concurrency.lockutils [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:18:43 compute-1 nova_compute[187078]: 2025-11-24 13:18:43.978 187082 DEBUG nova.network.neutron [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:18:45 compute-1 ovn_controller[95368]: 2025-11-24T13:18:45Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:b3:4c 10.100.0.13
Nov 24 13:18:45 compute-1 nova_compute[187078]: 2025-11-24 13:18:45.988 187082 DEBUG nova.network.neutron [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Updating instance_info_cache with network_info: [{"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.013 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Releasing lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.013 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Instance network_info: |[{"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.014 187082 DEBUG oslo_concurrency.lockutils [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.014 187082 DEBUG nova.network.neutron [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Refreshing network info cache for port ced2619b-6589-4d8e-be2b-6abff02aa6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.017 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Start _get_guest_xml network_info=[{"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.022 187082 WARNING nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.028 187082 DEBUG nova.virt.libvirt.host [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.029 187082 DEBUG nova.virt.libvirt.host [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.032 187082 DEBUG nova.virt.libvirt.host [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.033 187082 DEBUG nova.virt.libvirt.host [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.034 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.034 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.035 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.035 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.035 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.035 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.036 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.036 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.036 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.036 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.037 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.037 187082 DEBUG nova.virt.hardware [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.041 187082 DEBUG nova.virt.libvirt.vif [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:18:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-132711386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-132711386',id=3,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-p4tqeznp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:18:38Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.041 187082 DEBUG nova.network.os_vif_util [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.042 187082 DEBUG nova.network.os_vif_util [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.043 187082 DEBUG nova.objects.instance [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'pci_devices' on Instance uuid bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.053 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <uuid>bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3</uuid>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <name>instance-00000003</name>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-132711386</nova:name>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:18:46</nova:creationTime>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:user uuid="bad71b4865594b828cc87e37a3107bc4">tempest-TestExecuteActionsViaActuator-1613857129-project-member</nova:user>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:project uuid="5383ea8abbd144f89d959d9b1f9c052f">tempest-TestExecuteActionsViaActuator-1613857129</nova:project>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         <nova:port uuid="ced2619b-6589-4d8e-be2b-6abff02aa6a5">
Nov 24 13:18:46 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <system>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <entry name="serial">bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3</entry>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <entry name="uuid">bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3</entry>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </system>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <os>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </os>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <features>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </features>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.config"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:88:91:48"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <target dev="tapced2619b-65"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/console.log" append="off"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <video>
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </video>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:18:46 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:18:46 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:18:46 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:18:46 compute-1 nova_compute[187078]: </domain>
Nov 24 13:18:46 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.054 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Preparing to wait for external event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.055 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.055 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.055 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.056 187082 DEBUG nova.virt.libvirt.vif [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:18:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-132711386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-132711386',id=3,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-p4tqeznp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-161385
7129-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:18:38Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.056 187082 DEBUG nova.network.os_vif_util [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.056 187082 DEBUG nova.network.os_vif_util [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.057 187082 DEBUG os_vif [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.057 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.057 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.058 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.060 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.060 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapced2619b-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.061 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapced2619b-65, col_values=(('external_ids', {'iface-id': 'ced2619b-6589-4d8e-be2b-6abff02aa6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:91:48', 'vm-uuid': 'bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.062 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:46 compute-1 NetworkManager[55527]: <info>  [1763990326.0642] manager: (tapced2619b-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.065 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.068 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.069 187082 INFO os_vif [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65')
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.115 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.115 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.116 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No VIF found with MAC fa:16:3e:88:91:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.116 187082 INFO nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Using config drive
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.939 187082 INFO nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Creating config drive at /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.config
Nov 24 13:18:46 compute-1 nova_compute[187078]: 2025-11-24 13:18:46.945 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnamshknu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.068 187082 DEBUG oslo_concurrency.processutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnamshknu" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:18:47 compute-1 NetworkManager[55527]: <info>  [1763990327.1189] manager: (tapced2619b-65): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Nov 24 13:18:47 compute-1 kernel: tapced2619b-65: entered promiscuous mode
Nov 24 13:18:47 compute-1 ovn_controller[95368]: 2025-11-24T13:18:47Z|00041|binding|INFO|Claiming lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 for this chassis.
Nov 24 13:18:47 compute-1 ovn_controller[95368]: 2025-11-24T13:18:47Z|00042|binding|INFO|ced2619b-6589-4d8e-be2b-6abff02aa6a5: Claiming fa:16:3e:88:91:48 10.100.0.11
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.122 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:47 compute-1 ovn_controller[95368]: 2025-11-24T13:18:47Z|00043|binding|INFO|Setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 ovn-installed in OVS
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.139 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.142 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:47 compute-1 ovn_controller[95368]: 2025-11-24T13:18:47Z|00044|binding|INFO|Setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 up in Southbound
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.145 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:91:48 10.100.0.11'], port_security=['fa:16:3e:88:91:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=ced2619b-6589-4d8e-be2b-6abff02aa6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.146 104225 INFO neutron.agent.ovn.metadata.agent [-] Port ced2619b-6589-4d8e-be2b-6abff02aa6a5 in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 bound to our chassis
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.148 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:18:47 compute-1 systemd-udevd[209242]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:18:47 compute-1 systemd-machined[153355]: New machine qemu-3-instance-00000003.
Nov 24 13:18:47 compute-1 NetworkManager[55527]: <info>  [1763990327.1621] device (tapced2619b-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:18:47 compute-1 NetworkManager[55527]: <info>  [1763990327.1634] device (tapced2619b-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.162 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f0337295-700d-4d0b-98f4-e7cbbd9a27b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:47 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.185 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[17e09c98-9377-4633-a7de-1ae894e0b34c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.188 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee460af-c258-455a-b4c4-91b157fac991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.209 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.212 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9621ca-3eae-41c8-be43-6014f3a2ae01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.231 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[274067bd-8de1-406b-b823-bfdbcda1afac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209256, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.246 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3add2050-d3d5-41ed-a6d4-121a59cc45fd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209258, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209258, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.248 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.250 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:47 compute-1 nova_compute[187078]: 2025-11-24 13:18:47.251 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.252 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.252 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.252 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:18:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:18:47.253 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.037 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990328.036484, bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.038 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] VM Started (Lifecycle Event)
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.074 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.078 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990328.036695, bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.078 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] VM Paused (Lifecycle Event)
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.100 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.105 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.135 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.330 187082 DEBUG nova.compute.manager [req-1515f0e1-ef97-4b75-a707-dc02f481277c req-aa3f156f-6a6e-484f-9ae8-d43c72a0b81f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.331 187082 DEBUG oslo_concurrency.lockutils [req-1515f0e1-ef97-4b75-a707-dc02f481277c req-aa3f156f-6a6e-484f-9ae8-d43c72a0b81f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.331 187082 DEBUG oslo_concurrency.lockutils [req-1515f0e1-ef97-4b75-a707-dc02f481277c req-aa3f156f-6a6e-484f-9ae8-d43c72a0b81f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.332 187082 DEBUG oslo_concurrency.lockutils [req-1515f0e1-ef97-4b75-a707-dc02f481277c req-aa3f156f-6a6e-484f-9ae8-d43c72a0b81f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.332 187082 DEBUG nova.compute.manager [req-1515f0e1-ef97-4b75-a707-dc02f481277c req-aa3f156f-6a6e-484f-9ae8-d43c72a0b81f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Processing event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.333 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.336 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990328.3366525, bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.337 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] VM Resumed (Lifecycle Event)
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.339 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.343 187082 INFO nova.virt.libvirt.driver [-] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Instance spawned successfully.
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.344 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.354 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.363 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.369 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.369 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.370 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.370 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.371 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.372 187082 DEBUG nova.virt.libvirt.driver [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.392 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.534 187082 INFO nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Took 9.66 seconds to spawn the instance on the hypervisor.
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.535 187082 DEBUG nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.625 187082 INFO nova.compute.manager [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Took 10.07 seconds to build instance.
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.644 187082 DEBUG oslo_concurrency.lockutils [None req-761bdb13-1bce-4c07-8ebd-a7b563b0571f bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:48 compute-1 sshd-session[209259]: Invalid user ftpuser from 45.78.217.131 port 53456
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.855 187082 DEBUG nova.network.neutron [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Updated VIF entry in instance network info cache for port ced2619b-6589-4d8e-be2b-6abff02aa6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.856 187082 DEBUG nova.network.neutron [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Updating instance_info_cache with network_info: [{"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:18:48 compute-1 nova_compute[187078]: 2025-11-24 13:18:48.869 187082 DEBUG oslo_concurrency.lockutils [req-23bce579-18cf-41f3-9a20-973f58fdb412 req-0acd95ec-b3f3-401d-8457-1f5c0e4000db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:18:48 compute-1 sshd-session[209259]: Received disconnect from 45.78.217.131 port 53456:11: Bye Bye [preauth]
Nov 24 13:18:48 compute-1 sshd-session[209259]: Disconnected from invalid user ftpuser 45.78.217.131 port 53456 [preauth]
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: ERROR   13:18:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: ERROR   13:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: ERROR   13:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: ERROR   13:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: ERROR   13:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:18:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:18:50 compute-1 nova_compute[187078]: 2025-11-24 13:18:50.412 187082 DEBUG nova.compute.manager [req-6306d41c-a17d-488b-90c4-ffffbfa0b63f req-6e17b3da-9325-4c79-814b-8491a233e603 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:18:50 compute-1 nova_compute[187078]: 2025-11-24 13:18:50.412 187082 DEBUG oslo_concurrency.lockutils [req-6306d41c-a17d-488b-90c4-ffffbfa0b63f req-6e17b3da-9325-4c79-814b-8491a233e603 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:18:50 compute-1 nova_compute[187078]: 2025-11-24 13:18:50.412 187082 DEBUG oslo_concurrency.lockutils [req-6306d41c-a17d-488b-90c4-ffffbfa0b63f req-6e17b3da-9325-4c79-814b-8491a233e603 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:18:50 compute-1 nova_compute[187078]: 2025-11-24 13:18:50.413 187082 DEBUG oslo_concurrency.lockutils [req-6306d41c-a17d-488b-90c4-ffffbfa0b63f req-6e17b3da-9325-4c79-814b-8491a233e603 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:18:50 compute-1 nova_compute[187078]: 2025-11-24 13:18:50.413 187082 DEBUG nova.compute.manager [req-6306d41c-a17d-488b-90c4-ffffbfa0b63f req-6e17b3da-9325-4c79-814b-8491a233e603 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:18:50 compute-1 nova_compute[187078]: 2025-11-24 13:18:50.413 187082 WARNING nova.compute.manager [req-6306d41c-a17d-488b-90c4-ffffbfa0b63f req-6e17b3da-9325-4c79-814b-8491a233e603 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state None.
Nov 24 13:18:50 compute-1 podman[209270]: 2025-11-24 13:18:50.530684998 +0000 UTC m=+0.065410770 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:18:51 compute-1 nova_compute[187078]: 2025-11-24 13:18:51.063 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:51 compute-1 sshd-session[209268]: Invalid user zabbix from 68.183.82.237 port 40324
Nov 24 13:18:51 compute-1 sshd-session[209268]: Received disconnect from 68.183.82.237 port 40324:11: Bye Bye [preauth]
Nov 24 13:18:51 compute-1 sshd-session[209268]: Disconnected from invalid user zabbix 68.183.82.237 port 40324 [preauth]
Nov 24 13:18:52 compute-1 nova_compute[187078]: 2025-11-24 13:18:52.212 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:52 compute-1 sshd-session[209294]: Invalid user nginx from 85.209.134.43 port 37228
Nov 24 13:18:52 compute-1 sshd-session[209294]: Received disconnect from 85.209.134.43 port 37228:11: Bye Bye [preauth]
Nov 24 13:18:52 compute-1 sshd-session[209294]: Disconnected from invalid user nginx 85.209.134.43 port 37228 [preauth]
Nov 24 13:18:52 compute-1 podman[209296]: 2025-11-24 13:18:52.940771603 +0000 UTC m=+0.083576117 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:18:53 compute-1 sshd-session[209316]: Invalid user sahil from 176.114.89.34 port 46650
Nov 24 13:18:54 compute-1 sshd-session[209316]: Received disconnect from 176.114.89.34 port 46650:11: Bye Bye [preauth]
Nov 24 13:18:54 compute-1 sshd-session[209316]: Disconnected from invalid user sahil 176.114.89.34 port 46650 [preauth]
Nov 24 13:18:56 compute-1 nova_compute[187078]: 2025-11-24 13:18:56.066 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:18:56 compute-1 podman[209319]: 2025-11-24 13:18:56.528858798 +0000 UTC m=+0.074163818 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:18:56 compute-1 podman[209320]: 2025-11-24 13:18:56.557157022 +0000 UTC m=+0.096730706 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:18:57 compute-1 nova_compute[187078]: 2025-11-24 13:18:57.215 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:01 compute-1 nova_compute[187078]: 2025-11-24 13:19:01.069 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:01 compute-1 sshd-session[209365]: Invalid user sol from 45.148.10.240 port 50668
Nov 24 13:19:01 compute-1 sshd-session[209365]: Connection closed by invalid user sol 45.148.10.240 port 50668 [preauth]
Nov 24 13:19:02 compute-1 nova_compute[187078]: 2025-11-24 13:19:02.216 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:02 compute-1 nova_compute[187078]: 2025-11-24 13:19:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:02 compute-1 nova_compute[187078]: 2025-11-24 13:19:02.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:03 compute-1 nova_compute[187078]: 2025-11-24 13:19:03.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:03 compute-1 nova_compute[187078]: 2025-11-24 13:19:03.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:03 compute-1 nova_compute[187078]: 2025-11-24 13:19:03.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:19:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:04.148 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:04.150 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:04.151 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:05 compute-1 podman[209386]: 2025-11-24 13:19:05.525913461 +0000 UTC m=+0.069319091 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Nov 24 13:19:05 compute-1 podman[197429]: time="2025-11-24T13:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:19:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:19:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3047 "" "Go-http-client/1.1"
Nov 24 13:19:05 compute-1 nova_compute[187078]: 2025-11-24 13:19:05.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:06 compute-1 nova_compute[187078]: 2025-11-24 13:19:06.071 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:06 compute-1 ovn_controller[95368]: 2025-11-24T13:19:06Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:91:48 10.100.0.11
Nov 24 13:19:06 compute-1 ovn_controller[95368]: 2025-11-24T13:19:06Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:91:48 10.100.0.11
Nov 24 13:19:06 compute-1 nova_compute[187078]: 2025-11-24 13:19:06.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:06 compute-1 nova_compute[187078]: 2025-11-24 13:19:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:06 compute-1 nova_compute[187078]: 2025-11-24 13:19:06.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:19:06 compute-1 nova_compute[187078]: 2025-11-24 13:19:06.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:19:07 compute-1 nova_compute[187078]: 2025-11-24 13:19:07.218 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:07 compute-1 nova_compute[187078]: 2025-11-24 13:19:07.828 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:19:07 compute-1 nova_compute[187078]: 2025-11-24 13:19:07.829 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:19:07 compute-1 nova_compute[187078]: 2025-11-24 13:19:07.829 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:19:07 compute-1 nova_compute[187078]: 2025-11-24 13:19:07.829 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:09 compute-1 sshd-session[209367]: Invalid user monitoring from 45.78.194.40 port 49514
Nov 24 13:19:09 compute-1 sshd-session[209367]: Received disconnect from 45.78.194.40 port 49514:11: Bye Bye [preauth]
Nov 24 13:19:09 compute-1 sshd-session[209367]: Disconnected from invalid user monitoring 45.78.194.40 port 49514 [preauth]
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.075 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.137 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [{"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.148 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-e949dac9-04e8-4bf5-b73c-32ab3fc59472" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.148 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.149 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.149 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.167 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.169 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.169 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.170 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.240 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.306 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.308 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.361 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.367 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.422 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.423 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.488 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.677 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.679 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5543MB free_disk=73.40778350830078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.679 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.679 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.745 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance e949dac9-04e8-4bf5-b73c-32ab3fc59472 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.746 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.746 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.747 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.807 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.819 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.840 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:19:11 compute-1 nova_compute[187078]: 2025-11-24 13:19:11.840 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:12 compute-1 nova_compute[187078]: 2025-11-24 13:19:12.222 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:16 compute-1 nova_compute[187078]: 2025-11-24 13:19:16.078 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:17 compute-1 nova_compute[187078]: 2025-11-24 13:19:17.225 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: ERROR   13:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: ERROR   13:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: ERROR   13:19:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: ERROR   13:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: ERROR   13:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:19:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:19:21 compute-1 nova_compute[187078]: 2025-11-24 13:19:21.080 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:21 compute-1 podman[209420]: 2025-11-24 13:19:21.569151311 +0000 UTC m=+0.098770105 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:19:22 compute-1 nova_compute[187078]: 2025-11-24 13:19:22.229 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:23 compute-1 podman[209444]: 2025-11-24 13:19:23.528516878 +0000 UTC m=+0.072989771 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.679 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.679 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.697 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.772 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.773 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.786 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.787 187082 INFO nova.compute.claims [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.930 187082 DEBUG nova.compute.provider_tree [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.941 187082 DEBUG nova.scheduler.client.report [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.960 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:23 compute-1 nova_compute[187078]: 2025-11-24 13:19:23.961 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.000 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.001 187082 DEBUG nova.network.neutron [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.017 187082 INFO nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.037 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.145 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.146 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.147 187082 INFO nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Creating image(s)
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.147 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "/var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.148 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "/var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.148 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "/var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.161 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.220 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.222 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.222 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.237 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.307 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.308 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.344 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.345 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.346 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.410 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.411 187082 DEBUG nova.virt.disk.api [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Checking if we can resize image /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.412 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.514 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.515 187082 DEBUG nova.virt.disk.api [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Cannot resize image /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.515 187082 DEBUG nova.objects.instance [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'migration_context' on Instance uuid 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.526 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.527 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Ensure instance console log exists: /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.527 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.528 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:24 compute-1 nova_compute[187078]: 2025-11-24 13:19:24.528 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:25 compute-1 nova_compute[187078]: 2025-11-24 13:19:25.172 187082 DEBUG nova.policy [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bad71b4865594b828cc87e37a3107bc4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:19:26 compute-1 sshd-session[209479]: Invalid user postgres from 175.100.24.139 port 47704
Nov 24 13:19:26 compute-1 nova_compute[187078]: 2025-11-24 13:19:26.083 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:26 compute-1 nova_compute[187078]: 2025-11-24 13:19:26.157 187082 DEBUG nova.network.neutron [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Successfully created port: 871566ba-afd5-437e-944d-0d0e6c395933 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:19:26 compute-1 sshd-session[209479]: Received disconnect from 175.100.24.139 port 47704:11: Bye Bye [preauth]
Nov 24 13:19:26 compute-1 sshd-session[209479]: Disconnected from invalid user postgres 175.100.24.139 port 47704 [preauth]
Nov 24 13:19:27 compute-1 nova_compute[187078]: 2025-11-24 13:19:27.230 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:27 compute-1 podman[209481]: 2025-11-24 13:19:27.517854921 +0000 UTC m=+0.063340049 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:19:27 compute-1 podman[209482]: 2025-11-24 13:19:27.561056588 +0000 UTC m=+0.098146687 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.031 187082 DEBUG nova.network.neutron [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Successfully updated port: 871566ba-afd5-437e-944d-0d0e6c395933 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.047 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "refresh_cache-0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.048 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquired lock "refresh_cache-0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.048 187082 DEBUG nova.network.neutron [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.131 187082 DEBUG nova.compute.manager [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-changed-871566ba-afd5-437e-944d-0d0e6c395933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.132 187082 DEBUG nova.compute.manager [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Refreshing instance network info cache due to event network-changed-871566ba-afd5-437e-944d-0d0e6c395933. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.132 187082 DEBUG oslo_concurrency.lockutils [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:19:28 compute-1 nova_compute[187078]: 2025-11-24 13:19:28.237 187082 DEBUG nova.network.neutron [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.689 187082 DEBUG nova.network.neutron [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Updating instance_info_cache with network_info: [{"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.713 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Releasing lock "refresh_cache-0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.713 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Instance network_info: |[{"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.714 187082 DEBUG oslo_concurrency.lockutils [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.714 187082 DEBUG nova.network.neutron [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Refreshing network info cache for port 871566ba-afd5-437e-944d-0d0e6c395933 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.718 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Start _get_guest_xml network_info=[{"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.724 187082 WARNING nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.737 187082 DEBUG nova.virt.libvirt.host [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.737 187082 DEBUG nova.virt.libvirt.host [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.742 187082 DEBUG nova.virt.libvirt.host [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.743 187082 DEBUG nova.virt.libvirt.host [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.744 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.745 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.745 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.746 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.746 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.746 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.747 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.747 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.748 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.748 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.748 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.749 187082 DEBUG nova.virt.hardware [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.753 187082 DEBUG nova.virt.libvirt.vif [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1778465145',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1778465145',id=6,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-t63ko85t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:19:24Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.754 187082 DEBUG nova.network.os_vif_util [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.755 187082 DEBUG nova.network.os_vif_util [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.756 187082 DEBUG nova.objects.instance [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'pci_devices' on Instance uuid 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.770 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <uuid>0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a</uuid>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <name>instance-00000006</name>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1778465145</nova:name>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:19:29</nova:creationTime>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:user uuid="bad71b4865594b828cc87e37a3107bc4">tempest-TestExecuteActionsViaActuator-1613857129-project-member</nova:user>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:project uuid="5383ea8abbd144f89d959d9b1f9c052f">tempest-TestExecuteActionsViaActuator-1613857129</nova:project>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         <nova:port uuid="871566ba-afd5-437e-944d-0d0e6c395933">
Nov 24 13:19:29 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <system>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <entry name="serial">0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a</entry>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <entry name="uuid">0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a</entry>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </system>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <os>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </os>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <features>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </features>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.config"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:4b:2e:a1"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <target dev="tap871566ba-af"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/console.log" append="off"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <video>
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </video>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:19:29 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:19:29 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:19:29 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:19:29 compute-1 nova_compute[187078]: </domain>
Nov 24 13:19:29 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.771 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Preparing to wait for external event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.771 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.772 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.772 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.773 187082 DEBUG nova.virt.libvirt.vif [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1778465145',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1778465145',id=6,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-t63ko85t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:19:24Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.773 187082 DEBUG nova.network.os_vif_util [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.774 187082 DEBUG nova.network.os_vif_util [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.774 187082 DEBUG os_vif [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.775 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.775 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.775 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.778 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.778 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap871566ba-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.778 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap871566ba-af, col_values=(('external_ids', {'iface-id': '871566ba-afd5-437e-944d-0d0e6c395933', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:2e:a1', 'vm-uuid': '0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.780 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:29 compute-1 NetworkManager[55527]: <info>  [1763990369.7816] manager: (tap871566ba-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.784 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.790 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.792 187082 INFO os_vif [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af')
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.955 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.956 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.956 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] No VIF found with MAC fa:16:3e:4b:2e:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:19:29 compute-1 nova_compute[187078]: 2025-11-24 13:19:29.957 187082 INFO nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Using config drive
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.367 187082 INFO nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Creating config drive at /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.config
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.372 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp87b_ugp5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.496 187082 DEBUG oslo_concurrency.processutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp87b_ugp5" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:30 compute-1 kernel: tap871566ba-af: entered promiscuous mode
Nov 24 13:19:30 compute-1 NetworkManager[55527]: <info>  [1763990370.5549] manager: (tap871566ba-af): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Nov 24 13:19:30 compute-1 ovn_controller[95368]: 2025-11-24T13:19:30Z|00045|binding|INFO|Claiming lport 871566ba-afd5-437e-944d-0d0e6c395933 for this chassis.
Nov 24 13:19:30 compute-1 ovn_controller[95368]: 2025-11-24T13:19:30Z|00046|binding|INFO|871566ba-afd5-437e-944d-0d0e6c395933: Claiming fa:16:3e:4b:2e:a1 10.100.0.8
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.556 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:30 compute-1 ovn_controller[95368]: 2025-11-24T13:19:30Z|00047|binding|INFO|Setting lport 871566ba-afd5-437e-944d-0d0e6c395933 ovn-installed in OVS
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.572 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.574 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:30 compute-1 systemd-udevd[209543]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:19:30 compute-1 NetworkManager[55527]: <info>  [1763990370.5977] device (tap871566ba-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:19:30 compute-1 NetworkManager[55527]: <info>  [1763990370.5985] device (tap871566ba-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:19:30 compute-1 systemd-machined[153355]: New machine qemu-4-instance-00000006.
Nov 24 13:19:30 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Nov 24 13:19:30 compute-1 ovn_controller[95368]: 2025-11-24T13:19:30Z|00048|binding|INFO|Setting lport 871566ba-afd5-437e-944d-0d0e6c395933 up in Southbound
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.695 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:2e:a1 10.100.0.8'], port_security=['fa:16:3e:4b:2e:a1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=871566ba-afd5-437e-944d-0d0e6c395933) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.698 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 871566ba-afd5-437e-944d-0d0e6c395933 in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 bound to our chassis
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.700 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.716 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[36be1edc-bfa2-48d3-babe-759da944379a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.740 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d134c64f-3a04-4900-a0c0-e49c9b8eeb8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.743 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[8f417077-8a6b-4b4a-b627-38804b1ac401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.773 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[2a80736e-bf5a-4e45-a9d4-45b41030480f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.788 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[da9b723d-d489-47aa-8a24-385c41a26e55]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209560, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.801 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[914e57ae-079d-4c69-bdb2-3e61231a3783]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209561, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209561, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.802 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.804 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:30 compute-1 nova_compute[187078]: 2025-11-24 13:19:30.805 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.805 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.805 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.806 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:30 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:30.806 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.166 187082 DEBUG nova.compute.manager [req-c7936cbf-d5fc-44bc-9d09-4d7d4a17f57b req-eb5b9d50-7513-4816-a79a-d1fbee7f8496 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.166 187082 DEBUG oslo_concurrency.lockutils [req-c7936cbf-d5fc-44bc-9d09-4d7d4a17f57b req-eb5b9d50-7513-4816-a79a-d1fbee7f8496 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.166 187082 DEBUG oslo_concurrency.lockutils [req-c7936cbf-d5fc-44bc-9d09-4d7d4a17f57b req-eb5b9d50-7513-4816-a79a-d1fbee7f8496 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.167 187082 DEBUG oslo_concurrency.lockutils [req-c7936cbf-d5fc-44bc-9d09-4d7d4a17f57b req-eb5b9d50-7513-4816-a79a-d1fbee7f8496 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.167 187082 DEBUG nova.compute.manager [req-c7936cbf-d5fc-44bc-9d09-4d7d4a17f57b req-eb5b9d50-7513-4816-a79a-d1fbee7f8496 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Processing event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.296 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.297 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990371.2970846, 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.297 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] VM Started (Lifecycle Event)
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.300 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.304 187082 INFO nova.virt.libvirt.driver [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Instance spawned successfully.
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.305 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.390 187082 DEBUG nova.network.neutron [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Updated VIF entry in instance network info cache for port 871566ba-afd5-437e-944d-0d0e6c395933. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.390 187082 DEBUG nova.network.neutron [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Updating instance_info_cache with network_info: [{"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.397 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.402 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.402 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.402 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.403 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.403 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.404 187082 DEBUG nova.virt.libvirt.driver [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.409 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.414 187082 DEBUG oslo_concurrency.lockutils [req-b32e2df7-9ca7-47e3-b2fe-40b940936f91 req-d2462813-76cc-4fb6-8f49-d452e4c4adca 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.441 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.441 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990371.2973647, 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.442 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] VM Paused (Lifecycle Event)
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.454 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.458 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990371.300565, 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.459 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] VM Resumed (Lifecycle Event)
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.473 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.477 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.495 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.803 187082 INFO nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Took 7.66 seconds to spawn the instance on the hypervisor.
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.803 187082 DEBUG nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.876 187082 INFO nova.compute.manager [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Took 8.14 seconds to build instance.
Nov 24 13:19:31 compute-1 nova_compute[187078]: 2025-11-24 13:19:31.971 187082 DEBUG oslo_concurrency.lockutils [None req-1c3d650a-eea3-4178-961d-70df47d97f09 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:32.137 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:19:32 compute-1 nova_compute[187078]: 2025-11-24 13:19:32.138 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:32 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:32.138 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:19:32 compute-1 nova_compute[187078]: 2025-11-24 13:19:32.279 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:33 compute-1 nova_compute[187078]: 2025-11-24 13:19:33.235 187082 DEBUG nova.compute.manager [req-063c8c8f-0ee4-4e56-924c-b7941a304442 req-280604ae-415e-4b51-b2fb-eaee27e05f44 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:33 compute-1 nova_compute[187078]: 2025-11-24 13:19:33.235 187082 DEBUG oslo_concurrency.lockutils [req-063c8c8f-0ee4-4e56-924c-b7941a304442 req-280604ae-415e-4b51-b2fb-eaee27e05f44 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:33 compute-1 nova_compute[187078]: 2025-11-24 13:19:33.236 187082 DEBUG oslo_concurrency.lockutils [req-063c8c8f-0ee4-4e56-924c-b7941a304442 req-280604ae-415e-4b51-b2fb-eaee27e05f44 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:33 compute-1 nova_compute[187078]: 2025-11-24 13:19:33.236 187082 DEBUG oslo_concurrency.lockutils [req-063c8c8f-0ee4-4e56-924c-b7941a304442 req-280604ae-415e-4b51-b2fb-eaee27e05f44 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:33 compute-1 nova_compute[187078]: 2025-11-24 13:19:33.236 187082 DEBUG nova.compute.manager [req-063c8c8f-0ee4-4e56-924c-b7941a304442 req-280604ae-415e-4b51-b2fb-eaee27e05f44 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] No waiting events found dispatching network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:33 compute-1 nova_compute[187078]: 2025-11-24 13:19:33.236 187082 WARNING nova.compute.manager [req-063c8c8f-0ee4-4e56-924c-b7941a304442 req-280604ae-415e-4b51-b2fb-eaee27e05f44 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received unexpected event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 for instance with vm_state active and task_state None.
Nov 24 13:19:34 compute-1 nova_compute[187078]: 2025-11-24 13:19:34.783 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:35 compute-1 podman[197429]: time="2025-11-24T13:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:19:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:19:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 24 13:19:36 compute-1 podman[209569]: 2025-11-24 13:19:36.521810237 +0000 UTC m=+0.063688277 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:19:37 compute-1 nova_compute[187078]: 2025-11-24 13:19:37.283 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:38 compute-1 sshd-session[209591]: Invalid user vendas from 5.198.176.28 port 43596
Nov 24 13:19:38 compute-1 sshd-session[209591]: Received disconnect from 5.198.176.28 port 43596:11: Bye Bye [preauth]
Nov 24 13:19:38 compute-1 sshd-session[209591]: Disconnected from invalid user vendas 5.198.176.28 port 43596 [preauth]
Nov 24 13:19:39 compute-1 nova_compute[187078]: 2025-11-24 13:19:39.787 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:40.147 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.460 187082 DEBUG nova.compute.manager [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.544 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.545 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.569 187082 DEBUG nova.objects.instance [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'pci_requests' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.585 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.586 187082 INFO nova.compute.claims [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.587 187082 DEBUG nova.objects.instance [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'resources' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.597 187082 DEBUG nova.objects.instance [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'numa_topology' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.606 187082 DEBUG nova.objects.instance [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'pci_devices' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.641 187082 INFO nova.compute.resource_tracker [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Updating resource usage from migration 2022451c-cadc-4705-a344-7acb30118b3f
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.642 187082 DEBUG nova.compute.resource_tracker [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Starting to track incoming migration 2022451c-cadc-4705-a344-7acb30118b3f with flavor 9fb1ccae-4ba6-4040-a754-0b156b72dc25 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.675 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Check if temp file /var/lib/nova/instances/tmpnv5hvqh3 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.676 187082 DEBUG nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnv5hvqh3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.744 187082 DEBUG nova.compute.provider_tree [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.756 187082 DEBUG nova.scheduler.client.report [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.773 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:41 compute-1 nova_compute[187078]: 2025-11-24 13:19:41.774 187082 INFO nova.compute.manager [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Migrating
Nov 24 13:19:42 compute-1 nova_compute[187078]: 2025-11-24 13:19:42.284 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:44 compute-1 nova_compute[187078]: 2025-11-24 13:19:44.246 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:44 compute-1 nova_compute[187078]: 2025-11-24 13:19:44.302 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:44 compute-1 nova_compute[187078]: 2025-11-24 13:19:44.303 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:44 compute-1 nova_compute[187078]: 2025-11-24 13:19:44.373 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:44 compute-1 nova_compute[187078]: 2025-11-24 13:19:44.791 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:45 compute-1 sshd-session[209620]: Accepted publickey for nova from 192.168.122.100 port 46248 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:19:45 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:19:45 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:19:45 compute-1 systemd-logind[815]: New session 28 of user nova.
Nov 24 13:19:45 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:19:45 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:19:45 compute-1 systemd[209624]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:45 compute-1 systemd[209624]: Queued start job for default target Main User Target.
Nov 24 13:19:45 compute-1 systemd[209624]: Created slice User Application Slice.
Nov 24 13:19:45 compute-1 systemd[209624]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:19:45 compute-1 systemd[209624]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:19:45 compute-1 systemd[209624]: Reached target Paths.
Nov 24 13:19:45 compute-1 systemd[209624]: Reached target Timers.
Nov 24 13:19:45 compute-1 systemd[209624]: Starting D-Bus User Message Bus Socket...
Nov 24 13:19:45 compute-1 systemd[209624]: Starting Create User's Volatile Files and Directories...
Nov 24 13:19:45 compute-1 systemd[209624]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:19:45 compute-1 systemd[209624]: Reached target Sockets.
Nov 24 13:19:45 compute-1 systemd[209624]: Finished Create User's Volatile Files and Directories.
Nov 24 13:19:45 compute-1 systemd[209624]: Reached target Basic System.
Nov 24 13:19:45 compute-1 systemd[209624]: Reached target Main User Target.
Nov 24 13:19:45 compute-1 systemd[209624]: Startup finished in 146ms.
Nov 24 13:19:45 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:19:45 compute-1 systemd[1]: Started Session 28 of User nova.
Nov 24 13:19:45 compute-1 sshd-session[209620]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:45 compute-1 sshd-session[209651]: Received disconnect from 192.168.122.100 port 46248:11: disconnected by user
Nov 24 13:19:45 compute-1 sshd-session[209651]: Disconnected from user nova 192.168.122.100 port 46248
Nov 24 13:19:45 compute-1 sshd-session[209620]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:19:45 compute-1 systemd-logind[815]: Session 28 logged out. Waiting for processes to exit.
Nov 24 13:19:45 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Nov 24 13:19:45 compute-1 systemd-logind[815]: Removed session 28.
Nov 24 13:19:45 compute-1 sshd-session[209654]: Accepted publickey for nova from 192.168.122.100 port 46256 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:19:45 compute-1 systemd-logind[815]: New session 30 of user nova.
Nov 24 13:19:45 compute-1 systemd[1]: Started Session 30 of User nova.
Nov 24 13:19:45 compute-1 sshd-session[209654]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:45 compute-1 sshd-session[209657]: Received disconnect from 192.168.122.100 port 46256:11: disconnected by user
Nov 24 13:19:45 compute-1 sshd-session[209657]: Disconnected from user nova 192.168.122.100 port 46256
Nov 24 13:19:45 compute-1 sshd-session[209654]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:19:45 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Nov 24 13:19:45 compute-1 systemd-logind[815]: Session 30 logged out. Waiting for processes to exit.
Nov 24 13:19:45 compute-1 systemd-logind[815]: Removed session 30.
Nov 24 13:19:46 compute-1 ovn_controller[95368]: 2025-11-24T13:19:46Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4b:2e:a1 10.100.0.8
Nov 24 13:19:46 compute-1 ovn_controller[95368]: 2025-11-24T13:19:46Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4b:2e:a1 10.100.0.8
Nov 24 13:19:47 compute-1 nova_compute[187078]: 2025-11-24 13:19:47.286 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:47 compute-1 sshd-session[209659]: Accepted publickey for nova from 192.168.122.100 port 46272 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:19:47 compute-1 systemd-logind[815]: New session 31 of user nova.
Nov 24 13:19:47 compute-1 systemd[1]: Started Session 31 of User nova.
Nov 24 13:19:47 compute-1 sshd-session[209659]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:47 compute-1 sshd-session[209662]: Received disconnect from 192.168.122.100 port 46272:11: disconnected by user
Nov 24 13:19:47 compute-1 sshd-session[209662]: Disconnected from user nova 192.168.122.100 port 46272
Nov 24 13:19:47 compute-1 sshd-session[209659]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:19:47 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Nov 24 13:19:47 compute-1 systemd-logind[815]: Session 31 logged out. Waiting for processes to exit.
Nov 24 13:19:47 compute-1 systemd-logind[815]: Removed session 31.
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.189 187082 DEBUG nova.compute.manager [req-ec6d574f-c521-4ec0-8025-626a548c6217 req-2e286bbc-9e7d-4776-8b34-4fb050755b04 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-unplugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.190 187082 DEBUG oslo_concurrency.lockutils [req-ec6d574f-c521-4ec0-8025-626a548c6217 req-2e286bbc-9e7d-4776-8b34-4fb050755b04 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.190 187082 DEBUG oslo_concurrency.lockutils [req-ec6d574f-c521-4ec0-8025-626a548c6217 req-2e286bbc-9e7d-4776-8b34-4fb050755b04 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.190 187082 DEBUG oslo_concurrency.lockutils [req-ec6d574f-c521-4ec0-8025-626a548c6217 req-2e286bbc-9e7d-4776-8b34-4fb050755b04 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.190 187082 DEBUG nova.compute.manager [req-ec6d574f-c521-4ec0-8025-626a548c6217 req-2e286bbc-9e7d-4776-8b34-4fb050755b04 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] No waiting events found dispatching network-vif-unplugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.190 187082 WARNING nova.compute.manager [req-ec6d574f-c521-4ec0-8025-626a548c6217 req-2e286bbc-9e7d-4776-8b34-4fb050755b04 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received unexpected event network-vif-unplugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for instance with vm_state active and task_state resize_migrating.
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.905 187082 INFO nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Took 4.53 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.905 187082 DEBUG nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.919 187082 DEBUG nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnv5hvqh3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e32b0866-208a-4d7a-b9a8-d5c102eb761e),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.935 187082 DEBUG nova.objects.instance [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.936 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.938 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.938 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.950 187082 DEBUG nova.virt.libvirt.vif [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:18:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-132711386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-132711386',id=3,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:18:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-p4tqeznp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:18:48Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.950 187082 DEBUG nova.network.os_vif_util [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.951 187082 DEBUG nova.network.os_vif_util [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.951 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:19:48 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:88:91:48"/>
Nov 24 13:19:48 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:19:48 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:19:48 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:19:48 compute-1 nova_compute[187078]:   <target dev="tapced2619b-65"/>
Nov 24 13:19:48 compute-1 nova_compute[187078]: </interface>
Nov 24 13:19:48 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:19:48 compute-1 nova_compute[187078]: 2025-11-24 13:19:48.952 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:19:49 compute-1 sshd-session[209664]: Accepted publickey for nova from 192.168.122.100 port 46276 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:19:49 compute-1 systemd-logind[815]: New session 32 of user nova.
Nov 24 13:19:49 compute-1 systemd[1]: Started Session 32 of User nova.
Nov 24 13:19:49 compute-1 sshd-session[209664]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: ERROR   13:19:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: ERROR   13:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: ERROR   13:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: ERROR   13:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: ERROR   13:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:19:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:19:49 compute-1 nova_compute[187078]: 2025-11-24 13:19:49.441 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:19:49 compute-1 nova_compute[187078]: 2025-11-24 13:19:49.441 187082 INFO nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:19:49 compute-1 nova_compute[187078]: 2025-11-24 13:19:49.510 187082 INFO nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:19:49 compute-1 sshd-session[209667]: Received disconnect from 192.168.122.100 port 46276:11: disconnected by user
Nov 24 13:19:49 compute-1 sshd-session[209667]: Disconnected from user nova 192.168.122.100 port 46276
Nov 24 13:19:49 compute-1 sshd-session[209664]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:19:49 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Nov 24 13:19:49 compute-1 systemd-logind[815]: Session 32 logged out. Waiting for processes to exit.
Nov 24 13:19:49 compute-1 systemd-logind[815]: Removed session 32.
Nov 24 13:19:49 compute-1 nova_compute[187078]: 2025-11-24 13:19:49.793 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:49 compute-1 sshd-session[209669]: Accepted publickey for nova from 192.168.122.100 port 46292 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:19:49 compute-1 systemd-logind[815]: New session 33 of user nova.
Nov 24 13:19:49 compute-1 systemd[1]: Started Session 33 of User nova.
Nov 24 13:19:49 compute-1 sshd-session[209669]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:49 compute-1 sshd-session[209672]: Received disconnect from 192.168.122.100 port 46292:11: disconnected by user
Nov 24 13:19:49 compute-1 sshd-session[209672]: Disconnected from user nova 192.168.122.100 port 46292
Nov 24 13:19:49 compute-1 sshd-session[209669]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:19:49 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Nov 24 13:19:49 compute-1 systemd-logind[815]: Session 33 logged out. Waiting for processes to exit.
Nov 24 13:19:49 compute-1 systemd-logind[815]: Removed session 33.
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.012 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.014 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:19:50 compute-1 sshd-session[209674]: Accepted publickey for nova from 192.168.122.100 port 46298 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:19:50 compute-1 systemd-logind[815]: New session 34 of user nova.
Nov 24 13:19:50 compute-1 systemd[1]: Started Session 34 of User nova.
Nov 24 13:19:50 compute-1 sshd-session[209674]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:19:50 compute-1 sshd-session[209677]: Received disconnect from 192.168.122.100 port 46298:11: disconnected by user
Nov 24 13:19:50 compute-1 sshd-session[209677]: Disconnected from user nova 192.168.122.100 port 46298
Nov 24 13:19:50 compute-1 sshd-session[209674]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:19:50 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Nov 24 13:19:50 compute-1 systemd-logind[815]: Session 34 logged out. Waiting for processes to exit.
Nov 24 13:19:50 compute-1 systemd-logind[815]: Removed session 34.
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.307 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.307 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.308 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.308 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.308 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.308 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.309 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.309 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.309 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.309 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.309 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] No waiting events found dispatching network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.310 187082 WARNING nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received unexpected event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for instance with vm_state active and task_state resize_migrating.
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.310 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.310 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.310 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.310 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.311 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.311 187082 WARNING nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.311 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-changed-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.311 187082 DEBUG nova.compute.manager [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Refreshing instance network info cache due to event network-changed-ced2619b-6589-4d8e-be2b-6abff02aa6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.312 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.312 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.312 187082 DEBUG nova.network.neutron [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Refreshing network info cache for port ced2619b-6589-4d8e-be2b-6abff02aa6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.516 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.517 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:19:50 compute-1 nova_compute[187078]: 2025-11-24 13:19:50.855 187082 INFO nova.network.neutron [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Updating port d61a5c3c-44c2-4a1b-bb74-6615866e2f1a with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.019 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.019 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.523 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.524 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.850 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-26455831-8aa8-4319-ae19-8d4c064580c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.851 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-26455831-8aa8-4319-ae19-8d4c064580c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:19:51 compute-1 nova_compute[187078]: 2025-11-24 13:19:51.851 187082 DEBUG nova.network.neutron [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.027 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.028 187082 DEBUG nova.virt.libvirt.migration [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.065 187082 DEBUG nova.network.neutron [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Updated VIF entry in instance network info cache for port ced2619b-6589-4d8e-be2b-6abff02aa6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.066 187082 DEBUG nova.network.neutron [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Updating instance_info_cache with network_info: [{"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.088 187082 DEBUG oslo_concurrency.lockutils [req-f669bb21-51c2-408d-bb53-6fe988172ad4 req-ac43f46c-6ed3-4d82-a3f5-3368c7a64e3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.110 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990392.1097527, bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.110 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] VM Paused (Lifecycle Event)
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.129 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.140 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.162 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.288 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.295 187082 DEBUG nova.compute.manager [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-changed-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.295 187082 DEBUG nova.compute.manager [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Refreshing instance network info cache due to event network-changed-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.295 187082 DEBUG oslo_concurrency.lockutils [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-26455831-8aa8-4319-ae19-8d4c064580c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:19:52 compute-1 kernel: tapced2619b-65 (unregistering): left promiscuous mode
Nov 24 13:19:52 compute-1 NetworkManager[55527]: <info>  [1763990392.3027] device (tapced2619b-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.312 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00049|binding|INFO|Releasing lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 from this chassis (sb_readonly=0)
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00050|binding|INFO|Setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 down in Southbound
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00051|binding|INFO|Removing iface tapced2619b-65 ovn-installed in OVS
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.314 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.334 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.331 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:91:48 10.100.0.11'], port_security=['fa:16:3e:88:91:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=ced2619b-6589-4d8e-be2b-6abff02aa6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.332 104225 INFO neutron.agent.ovn.metadata.agent [-] Port ced2619b-6589-4d8e-be2b-6abff02aa6a5 in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.334 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:19:52 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 24 13:19:52 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 20.040s CPU time.
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.352 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[df8a9a05-6943-4519-8492-e300690340d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 systemd-machined[153355]: Machine qemu-3-instance-00000003 terminated.
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.377 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[946aa295-96e4-45c3-88bb-246cac807b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.380 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[a556a93f-19e9-48a4-8223-ddcd23e2a18d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 podman[209697]: 2025-11-24 13:19:52.384801064 +0000 UTC m=+0.054392045 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.406 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[fd4d318a-f0bd-4e63-b969-b16c7d3569c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.421 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[05cfda1a-e74d-438f-8614-da5f8c4bb25e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 9, 'rx_bytes': 742, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 9, 'rx_bytes': 742, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209734, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.435 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ad995ca2-203d-4b6e-ab85-9b0458d5055c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209735, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209735, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.437 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.438 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.443 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.444 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.444 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.445 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.445 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:52 compute-1 kernel: tapced2619b-65: entered promiscuous mode
Nov 24 13:19:52 compute-1 NetworkManager[55527]: <info>  [1763990392.4854] manager: (tapced2619b-65): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 24 13:19:52 compute-1 kernel: tapced2619b-65 (unregistering): left promiscuous mode
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.490 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00052|binding|INFO|Claiming lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 for this chassis.
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00053|binding|INFO|ced2619b-6589-4d8e-be2b-6abff02aa6a5: Claiming fa:16:3e:88:91:48 10.100.0.11
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.502 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:91:48 10.100.0.11'], port_security=['fa:16:3e:88:91:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=ced2619b-6589-4d8e-be2b-6abff02aa6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.503 104225 INFO neutron.agent.ovn.metadata.agent [-] Port ced2619b-6589-4d8e-be2b-6abff02aa6a5 in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 bound to our chassis
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.506 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00054|binding|INFO|Setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 ovn-installed in OVS
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00055|binding|INFO|Setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 up in Southbound
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00056|binding|INFO|Releasing lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 from this chassis (sb_readonly=1)
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00057|if_status|INFO|Not setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 down as sb is readonly
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00058|binding|INFO|Removing iface tapced2619b-65 ovn-installed in OVS
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.509 187082 DEBUG nova.compute.manager [req-81ec4d64-0996-4a3c-bb67-2610794b375b req-aaa4f53b-eb05-4a4f-9bd4-abdd556c929d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.509 187082 DEBUG oslo_concurrency.lockutils [req-81ec4d64-0996-4a3c-bb67-2610794b375b req-aaa4f53b-eb05-4a4f-9bd4-abdd556c929d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.509 187082 DEBUG oslo_concurrency.lockutils [req-81ec4d64-0996-4a3c-bb67-2610794b375b req-aaa4f53b-eb05-4a4f-9bd4-abdd556c929d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.509 187082 DEBUG oslo_concurrency.lockutils [req-81ec4d64-0996-4a3c-bb67-2610794b375b req-aaa4f53b-eb05-4a4f-9bd4-abdd556c929d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.510 187082 DEBUG nova.compute.manager [req-81ec4d64-0996-4a3c-bb67-2610794b375b req-aaa4f53b-eb05-4a4f-9bd4-abdd556c929d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.510 187082 DEBUG nova.compute.manager [req-81ec4d64-0996-4a3c-bb67-2610794b375b req-aaa4f53b-eb05-4a4f-9bd4-abdd556c929d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.510 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00059|binding|INFO|Releasing lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 from this chassis (sb_readonly=0)
Nov 24 13:19:52 compute-1 ovn_controller[95368]: 2025-11-24T13:19:52Z|00060|binding|INFO|Setting lport ced2619b-6589-4d8e-be2b-6abff02aa6a5 down in Southbound
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.518 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:91:48 10.100.0.11'], port_security=['fa:16:3e:88:91:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=ced2619b-6589-4d8e-be2b-6abff02aa6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.519 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.519 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b1110a5c-53b3-454f-9df9-6537a2314fd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.533 187082 DEBUG nova.virt.libvirt.guest [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.534 187082 INFO nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migration operation has completed
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.534 187082 INFO nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] _post_live_migration() is started..
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.538 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.539 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.539 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.544 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[fdda60ee-6e03-4d8e-9331-044c77ead089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.547 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[661a7013-a8df-45b4-a87d-736d93916cc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.572 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5902d797-a4a9-48f8-9671-3e8e7bfec827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.588 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ca3ad5-7eee-4dd4-a019-519e8332d087]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 11, 'rx_bytes': 742, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 11, 'rx_bytes': 742, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209753, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.602 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[29546983-01ba-4ed4-86da-2548cbb2e2eb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209754, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209754, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.603 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.604 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.608 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.608 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.609 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.609 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.609 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.610 104225 INFO neutron.agent.ovn.metadata.agent [-] Port ced2619b-6589-4d8e-be2b-6abff02aa6a5 in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.611 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.625 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[59abf480-6092-4f29-be95-e4429dfe79b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.653 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d7049528-dd84-47d3-b742-20ab2123dd40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.656 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7df55f-8bb4-4eb3-96ab-5c77529fd2cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.691 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[b5712503-c0a6-43ab-a30d-e65fd1ce43be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.712 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[54d5d293-9658-43e3-bf40-a008b57893b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 13, 'rx_bytes': 742, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 13, 'rx_bytes': 742, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209761, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.728 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[047aba62-2341-4fed-b010-b990ac43118b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209762, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209762, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.730 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.731 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 nova_compute[187078]: 2025-11-24 13:19:52.735 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.736 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.736 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.736 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:52.736 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.212 187082 DEBUG nova.network.neutron [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Updating instance_info_cache with network_info: [{"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.234 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-26455831-8aa8-4319-ae19-8d4c064580c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.238 187082 DEBUG oslo_concurrency.lockutils [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-26455831-8aa8-4319-ae19-8d4c064580c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.238 187082 DEBUG nova.network.neutron [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Refreshing network info cache for port d61a5c3c-44c2-4a1b-bb74-6615866e2f1a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.318 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.320 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.320 187082 INFO nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Creating image(s)
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.321 187082 DEBUG nova.objects.instance [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.329 187082 DEBUG oslo_concurrency.processutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.384 187082 DEBUG oslo_concurrency.processutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.386 187082 DEBUG nova.virt.disk.api [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Checking if we can resize image /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.386 187082 DEBUG oslo_concurrency.processutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.444 187082 DEBUG oslo_concurrency.processutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.445 187082 DEBUG nova.virt.disk.api [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Cannot resize image /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.457 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.458 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Ensure instance console log exists: /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.458 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.459 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.459 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.462 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Start _get_guest_xml network_info=[{"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:c6:4a:b5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.468 187082 WARNING nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.485 187082 DEBUG nova.virt.libvirt.host [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.486 187082 DEBUG nova.virt.libvirt.host [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.490 187082 DEBUG nova.virt.libvirt.host [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.491 187082 DEBUG nova.virt.libvirt.host [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.493 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.493 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.494 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.494 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.494 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.495 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.495 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.495 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.495 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.496 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.496 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.496 187082 DEBUG nova.virt.hardware [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.497 187082 DEBUG nova.objects.instance [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.512 187082 DEBUG oslo_concurrency.processutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.567 187082 DEBUG oslo_concurrency.processutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.568 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "/var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.568 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "/var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.569 187082 DEBUG oslo_concurrency.lockutils [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "/var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.570 187082 DEBUG nova.virt.libvirt.vif [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:19:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1452138830',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1452138830',id=5,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:19:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-hf1m3cwa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:19:50Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=26455831-8aa8-4319-ae19-8d4c064580c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:c6:4a:b5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.571 187082 DEBUG nova.network.os_vif_util [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:c6:4a:b5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.572 187082 DEBUG nova.network.os_vif_util [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.574 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <uuid>26455831-8aa8-4319-ae19-8d4c064580c2</uuid>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <name>instance-00000005</name>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1452138830</nova:name>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:19:53</nova:creationTime>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:user uuid="bad71b4865594b828cc87e37a3107bc4">tempest-TestExecuteActionsViaActuator-1613857129-project-member</nova:user>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:project uuid="5383ea8abbd144f89d959d9b1f9c052f">tempest-TestExecuteActionsViaActuator-1613857129</nova:project>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         <nova:port uuid="d61a5c3c-44c2-4a1b-bb74-6615866e2f1a">
Nov 24 13:19:53 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <system>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <entry name="serial">26455831-8aa8-4319-ae19-8d4c064580c2</entry>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <entry name="uuid">26455831-8aa8-4319-ae19-8d4c064580c2</entry>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </system>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <os>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </os>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <features>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </features>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk.config"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:c6:4a:b5"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <target dev="tapd61a5c3c-44"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/console.log" append="off"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <video>
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </video>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:19:53 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:19:53 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:19:53 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:19:53 compute-1 nova_compute[187078]: </domain>
Nov 24 13:19:53 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.575 187082 DEBUG nova.virt.libvirt.vif [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:19:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1452138830',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1452138830',id=5,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:19:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-hf1m3cwa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:19:50Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=26455831-8aa8-4319-ae19-8d4c064580c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:c6:4a:b5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.576 187082 DEBUG nova.network.os_vif_util [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "vif_mac": "fa:16:3e:c6:4a:b5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.576 187082 DEBUG nova.network.os_vif_util [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.577 187082 DEBUG os_vif [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.577 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.578 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.578 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.580 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.581 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd61a5c3c-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.581 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd61a5c3c-44, col_values=(('external_ids', {'iface-id': 'd61a5c3c-44c2-4a1b-bb74-6615866e2f1a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:4a:b5', 'vm-uuid': '26455831-8aa8-4319-ae19-8d4c064580c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.582 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 NetworkManager[55527]: <info>  [1763990393.5836] manager: (tapd61a5c3c-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.585 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.589 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.590 187082 INFO os_vif [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44')
Nov 24 13:19:53 compute-1 podman[209775]: 2025-11-24 13:19:53.681658768 +0000 UTC m=+0.054557469 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.742 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.742 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.742 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No VIF found with MAC fa:16:3e:c6:4a:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.743 187082 INFO nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Using config drive
Nov 24 13:19:53 compute-1 NetworkManager[55527]: <info>  [1763990393.7945] manager: (tapd61a5c3c-44): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Nov 24 13:19:53 compute-1 systemd-udevd[209710]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:19:53 compute-1 kernel: tapd61a5c3c-44: entered promiscuous mode
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.799 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 ovn_controller[95368]: 2025-11-24T13:19:53Z|00061|binding|INFO|Claiming lport d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for this chassis.
Nov 24 13:19:53 compute-1 ovn_controller[95368]: 2025-11-24T13:19:53Z|00062|binding|INFO|d61a5c3c-44c2-4a1b-bb74-6615866e2f1a: Claiming fa:16:3e:c6:4a:b5 10.100.0.9
Nov 24 13:19:53 compute-1 NetworkManager[55527]: <info>  [1763990393.8075] device (tapd61a5c3c-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:19:53 compute-1 NetworkManager[55527]: <info>  [1763990393.8081] device (tapd61a5c3c-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.807 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:4a:b5 10.100.0.9'], port_security=['fa:16:3e:c6:4a:b5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '26455831-8aa8-4319-ae19-8d4c064580c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.808 104225 INFO neutron.agent.ovn.metadata.agent [-] Port d61a5c3c-44c2-4a1b-bb74-6615866e2f1a in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 bound to our chassis
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.809 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:19:53 compute-1 ovn_controller[95368]: 2025-11-24T13:19:53Z|00063|binding|INFO|Setting lport d61a5c3c-44c2-4a1b-bb74-6615866e2f1a ovn-installed in OVS
Nov 24 13:19:53 compute-1 ovn_controller[95368]: 2025-11-24T13:19:53Z|00064|binding|INFO|Setting lport d61a5c3c-44c2-4a1b-bb74-6615866e2f1a up in Southbound
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.816 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.822 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.828 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[850e8ef8-bfbe-4e98-b59a-a37a4e1d54bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:53 compute-1 systemd-machined[153355]: New machine qemu-5-instance-00000005.
Nov 24 13:19:53 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.857 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[eb24a44b-4ddc-4607-a1fb-483dba56b2d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.861 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[6aeab20b-dfc8-410e-b0ff-40f0f0ebb4ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:53 compute-1 sshd-session[209797]: Invalid user steam from 85.209.134.43 port 41896
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.889 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[6251acab-fbc9-4107-bbff-31f625acb0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:53 compute-1 sshd-session[209797]: Received disconnect from 85.209.134.43 port 41896:11: Bye Bye [preauth]
Nov 24 13:19:53 compute-1 sshd-session[209797]: Disconnected from invalid user steam 85.209.134.43 port 41896 [preauth]
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.906 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8e90d13c-1add-43a4-ae4f-85342b4a568f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209825, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.921 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0a9339-09aa-4616-97d8-f7c7a1fdce29]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209828, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209828, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.923 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.925 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 nova_compute[187078]: 2025-11-24 13:19:53.929 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.929 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.930 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.930 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:19:53.930 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.238 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990394.2380176, 26455831-8aa8-4319-ae19-8d4c064580c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.238 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] VM Resumed (Lifecycle Event)
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.241 187082 DEBUG nova.compute.manager [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.244 187082 INFO nova.virt.libvirt.driver [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Instance running successfully.
Nov 24 13:19:54 compute-1 virtqemud[186628]: argument unsupported: QEMU guest agent is not configured
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.247 187082 DEBUG nova.virt.libvirt.guest [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.247 187082 DEBUG nova.virt.libvirt.driver [None req-d6b92e07-156c-42ca-94a4-b58256a92c24 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.265 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.268 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.319 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.320 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990394.2401354, 26455831-8aa8-4319-ae19-8d4c064580c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.321 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] VM Started (Lifecycle Event)
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.366 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.375 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.482 187082 DEBUG nova.network.neutron [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port ced2619b-6589-4d8e-be2b-6abff02aa6a5 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.482 187082 DEBUG nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.483 187082 DEBUG nova.virt.libvirt.vif [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:18:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-132711386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-132711386',id=3,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:18:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-p4tqeznp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image
_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:19:39Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.484 187082 DEBUG nova.network.os_vif_util [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "address": "fa:16:3e:88:91:48", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapced2619b-65", "ovs_interfaceid": "ced2619b-6589-4d8e-be2b-6abff02aa6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.485 187082 DEBUG nova.network.os_vif_util [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.485 187082 DEBUG os_vif [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.488 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.489 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapced2619b-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.491 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.493 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.495 187082 INFO os_vif [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:91:48,bridge_name='br-int',has_traffic_filtering=True,id=ced2619b-6589-4d8e-be2b-6abff02aa6a5,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapced2619b-65')
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.496 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.496 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.496 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.496 187082 DEBUG nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.497 187082 INFO nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Deleting instance files /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3_del
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.498 187082 INFO nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Deletion of /var/lib/nova/instances/bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3_del complete
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.760 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.761 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.761 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.762 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.762 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.762 187082 WARNING nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.762 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.762 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.763 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.763 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.763 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.763 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.763 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.764 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.764 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.764 187082 DEBUG oslo_concurrency.lockutils [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.764 187082 DEBUG nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:54 compute-1 nova_compute[187078]: 2025-11-24 13:19:54.765 187082 WARNING nova.compute.manager [req-b947f4df-13a3-4e93-90a5-0280069203c9 req-b02a3fab-acfe-48ce-a0ce-1d1203cfc71a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:55 compute-1 nova_compute[187078]: 2025-11-24 13:19:55.054 187082 DEBUG nova.network.neutron [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Updated VIF entry in instance network info cache for port d61a5c3c-44c2-4a1b-bb74-6615866e2f1a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:19:55 compute-1 nova_compute[187078]: 2025-11-24 13:19:55.054 187082 DEBUG nova.network.neutron [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Updating instance_info_cache with network_info: [{"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:19:55 compute-1 nova_compute[187078]: 2025-11-24 13:19:55.070 187082 DEBUG oslo_concurrency.lockutils [req-b9b78133-3f97-424c-8b3e-1f7e251f2030 req-e2e6514c-cece-4d55-be38-352f67759b8c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-26455831-8aa8-4319-ae19-8d4c064580c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.904 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.904 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.904 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.904 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.905 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.905 187082 WARNING nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.905 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.905 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.905 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.905 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 WARNING nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.906 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.907 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-unplugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.907 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.907 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.907 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.907 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.908 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.908 187082 WARNING nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.908 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.908 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.908 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.908 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.909 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] No waiting events found dispatching network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.909 187082 WARNING nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received unexpected event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for instance with vm_state resized and task_state None.
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.909 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.909 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.909 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.910 187082 DEBUG oslo_concurrency.lockutils [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.910 187082 DEBUG nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] No waiting events found dispatching network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:56 compute-1 nova_compute[187078]: 2025-11-24 13:19:56.910 187082 WARNING nova.compute.manager [req-d842b106-d32a-4cd3-8f04-87add59fa250 req-698b9747-5954-4b44-b377-694b74e24a3d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received unexpected event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for instance with vm_state resized and task_state None.
Nov 24 13:19:57 compute-1 nova_compute[187078]: 2025-11-24 13:19:57.293 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:58 compute-1 podman[209842]: 2025-11-24 13:19:58.535258668 +0000 UTC m=+0.078608154 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:19:58 compute-1 podman[209843]: 2025-11-24 13:19:58.570945102 +0000 UTC m=+0.109981490 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.067 187082 DEBUG nova.compute.manager [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.068 187082 DEBUG oslo_concurrency.lockutils [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.068 187082 DEBUG oslo_concurrency.lockutils [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.068 187082 DEBUG oslo_concurrency.lockutils [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.068 187082 DEBUG nova.compute.manager [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.068 187082 WARNING nova.compute.manager [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.068 187082 DEBUG nova.compute.manager [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.069 187082 DEBUG oslo_concurrency.lockutils [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.069 187082 DEBUG oslo_concurrency.lockutils [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.069 187082 DEBUG oslo_concurrency.lockutils [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.069 187082 DEBUG nova.compute.manager [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] No waiting events found dispatching network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.069 187082 WARNING nova.compute.manager [req-f4110037-c644-4824-b02f-7324c363ea61 req-40c94ce6-1343-4e0e-a9f3-a1953c4e4667 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Received unexpected event network-vif-plugged-ced2619b-6589-4d8e-be2b-6abff02aa6a5 for instance with vm_state active and task_state migrating.
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.491 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.793 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.795 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.795 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.823 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.824 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.824 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.824 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.917 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.994 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:19:59 compute-1 nova_compute[187078]: 2025-11-24 13:19:59.996 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.090 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.097 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.153 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.154 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:00 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:20:00 compute-1 systemd[209624]: Activating special unit Exit the Session...
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped target Main User Target.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped target Basic System.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped target Paths.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped target Sockets.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped target Timers.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:20:00 compute-1 systemd[209624]: Closed D-Bus User Message Bus Socket.
Nov 24 13:20:00 compute-1 systemd[209624]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:20:00 compute-1 systemd[209624]: Removed slice User Application Slice.
Nov 24 13:20:00 compute-1 systemd[209624]: Reached target Shutdown.
Nov 24 13:20:00 compute-1 systemd[209624]: Finished Exit the Session.
Nov 24 13:20:00 compute-1 systemd[209624]: Reached target Exit the Session.
Nov 24 13:20:00 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:20:00 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:20:00 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:20:00 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:20:00 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:20:00 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:20:00 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.249 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.256 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.316 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.317 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.410 187082 DEBUG oslo_concurrency.processutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.644 187082 WARNING nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.648 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5379MB free_disk=73.3791389465332GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.648 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.649 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.699 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.730 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.807 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Instance 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.808 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Instance 26455831-8aa8-4319-ae19-8d4c064580c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.809 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration e32b0866-208a-4d7a-b9a8-d5c102eb761e is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.809 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Instance e949dac9-04e8-4bf5-b73c-32ab3fc59472 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.810 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.810 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.981 187082 DEBUG nova.compute.provider_tree [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:20:00 compute-1 nova_compute[187078]: 2025-11-24 13:20:00.994 187082 DEBUG nova.scheduler.client.report [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:20:01 compute-1 nova_compute[187078]: 2025-11-24 13:20:01.052 187082 DEBUG nova.compute.resource_tracker [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:20:01 compute-1 nova_compute[187078]: 2025-11-24 13:20:01.052 187082 DEBUG oslo_concurrency.lockutils [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:01 compute-1 nova_compute[187078]: 2025-11-24 13:20:01.060 187082 INFO nova.compute.manager [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:20:01 compute-1 nova_compute[187078]: 2025-11-24 13:20:01.122 187082 INFO nova.scheduler.client.report [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration e32b0866-208a-4d7a-b9a8-d5c102eb761e
Nov 24 13:20:01 compute-1 nova_compute[187078]: 2025-11-24 13:20:01.123 187082 DEBUG nova.virt.libvirt.driver [None req-d3278c56-39e4-4cc1-9f4e-aaf68b4b60a2 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:20:02 compute-1 nova_compute[187078]: 2025-11-24 13:20:02.304 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:02 compute-1 nova_compute[187078]: 2025-11-24 13:20:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:02 compute-1 nova_compute[187078]: 2025-11-24 13:20:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:02 compute-1 nova_compute[187078]: 2025-11-24 13:20:02.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:20:02 compute-1 nova_compute[187078]: 2025-11-24 13:20:02.683 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:03 compute-1 nova_compute[187078]: 2025-11-24 13:20:03.690 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:03 compute-1 sshd-session[209906]: Received disconnect from 176.114.89.34 port 48056:11: Bye Bye [preauth]
Nov 24 13:20:03 compute-1 sshd-session[209906]: Disconnected from authenticating user root 176.114.89.34 port 48056 [preauth]
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.147 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.148 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.148 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.493 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.715 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.716 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.716 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.717 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.717 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.718 187082 INFO nova.compute.manager [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Terminating instance
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.719 187082 DEBUG nova.compute.manager [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:20:04 compute-1 kernel: tap871566ba-af (unregistering): left promiscuous mode
Nov 24 13:20:04 compute-1 NetworkManager[55527]: <info>  [1763990404.7506] device (tap871566ba-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:20:04 compute-1 ovn_controller[95368]: 2025-11-24T13:20:04Z|00065|binding|INFO|Releasing lport 871566ba-afd5-437e-944d-0d0e6c395933 from this chassis (sb_readonly=0)
Nov 24 13:20:04 compute-1 ovn_controller[95368]: 2025-11-24T13:20:04Z|00066|binding|INFO|Setting lport 871566ba-afd5-437e-944d-0d0e6c395933 down in Southbound
Nov 24 13:20:04 compute-1 ovn_controller[95368]: 2025-11-24T13:20:04Z|00067|binding|INFO|Removing iface tap871566ba-af ovn-installed in OVS
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.807 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.809 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.822 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.823 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:2e:a1 10.100.0.8'], port_security=['fa:16:3e:4b:2e:a1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=871566ba-afd5-437e-944d-0d0e6c395933) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.826 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 871566ba-afd5-437e-944d-0d0e6c395933 in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.829 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.846 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[bd3aa0b3-9cfe-4e45-b43d-e7921fd0106a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:04 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 24 13:20:04 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 14.466s CPU time.
Nov 24 13:20:04 compute-1 systemd-machined[153355]: Machine qemu-4-instance-00000006 terminated.
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.880 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5c51f4a1-06ec-4214-807b-b91cea797002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.885 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2d7b0a-04fb-4304-b8d2-275e83ceab9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.931 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[fabb953d-1bbc-423d-8ec5-ae38b50f31a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.943 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.947 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.962 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5260724a-0dc6-480a-b425-2a1e37405492]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 784, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 17, 'rx_bytes': 784, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209922, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.985 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[585234d4-3f99-40d7-92ed-bdf9cc66a93f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209932, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209932, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.987 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.989 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 nova_compute[187078]: 2025-11-24 13:20:04.993 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.994 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.994 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.995 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:04.995 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.001 187082 INFO nova.virt.libvirt.driver [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Instance destroyed successfully.
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.002 187082 DEBUG nova.objects.instance [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'resources' on Instance uuid 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.053 187082 DEBUG nova.virt.libvirt.vif [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1778465145',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1778465145',id=6,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:19:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-t63ko85t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:19:31Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.054 187082 DEBUG nova.network.os_vif_util [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "871566ba-afd5-437e-944d-0d0e6c395933", "address": "fa:16:3e:4b:2e:a1", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap871566ba-af", "ovs_interfaceid": "871566ba-afd5-437e-944d-0d0e6c395933", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.055 187082 DEBUG nova.network.os_vif_util [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.055 187082 DEBUG os_vif [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.057 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.058 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap871566ba-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.060 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.063 187082 INFO os_vif [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:2e:a1,bridge_name='br-int',has_traffic_filtering=True,id=871566ba-afd5-437e-944d-0d0e6c395933,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap871566ba-af')
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.064 187082 INFO nova.virt.libvirt.driver [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Deleting instance files /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a_del
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.065 187082 INFO nova.virt.libvirt.driver [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Deletion of /var/lib/nova/instances/0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a_del complete
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.232 187082 DEBUG nova.virt.libvirt.host [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.234 187082 INFO nova.virt.libvirt.host [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] UEFI support detected
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.237 187082 INFO nova.compute.manager [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Took 0.52 seconds to destroy the instance on the hypervisor.
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.238 187082 DEBUG oslo.service.loopingcall [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.238 187082 DEBUG nova.compute.manager [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.238 187082 DEBUG nova.network.neutron [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.253 187082 DEBUG nova.compute.manager [req-e648e4d9-a450-43b1-b963-747b0deceb03 req-3e30aba8-503e-4e5d-b342-5a7bb91ebbeb 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-vif-unplugged-871566ba-afd5-437e-944d-0d0e6c395933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.254 187082 DEBUG oslo_concurrency.lockutils [req-e648e4d9-a450-43b1-b963-747b0deceb03 req-3e30aba8-503e-4e5d-b342-5a7bb91ebbeb 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.254 187082 DEBUG oslo_concurrency.lockutils [req-e648e4d9-a450-43b1-b963-747b0deceb03 req-3e30aba8-503e-4e5d-b342-5a7bb91ebbeb 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.254 187082 DEBUG oslo_concurrency.lockutils [req-e648e4d9-a450-43b1-b963-747b0deceb03 req-3e30aba8-503e-4e5d-b342-5a7bb91ebbeb 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.255 187082 DEBUG nova.compute.manager [req-e648e4d9-a450-43b1-b963-747b0deceb03 req-3e30aba8-503e-4e5d-b342-5a7bb91ebbeb 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] No waiting events found dispatching network-vif-unplugged-871566ba-afd5-437e-944d-0d0e6c395933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.255 187082 DEBUG nova.compute.manager [req-e648e4d9-a450-43b1-b963-747b0deceb03 req-3e30aba8-503e-4e5d-b342-5a7bb91ebbeb 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-vif-unplugged-871566ba-afd5-437e-944d-0d0e6c395933 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:20:05 compute-1 podman[197429]: time="2025-11-24T13:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:20:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:20:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3047 "" "Go-http-client/1.1"
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.672 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.672 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:05 compute-1 nova_compute[187078]: 2025-11-24 13:20:05.672 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.689 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.689 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.794 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:06 compute-1 podman[209951]: 2025-11-24 13:20:06.800003147 +0000 UTC m=+0.089655896 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.865 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.867 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.891 187082 DEBUG nova.network.neutron [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.910 187082 INFO nova.compute.manager [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Took 1.67 seconds to deallocate network for instance.
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.926 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.931 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.970 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.971 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:06 compute-1 nova_compute[187078]: 2025-11-24 13:20:06.973 187082 DEBUG nova.compute.manager [req-bc6ef732-fd6b-4ea8-8cf7-3476bef6ec27 req-c5d10e85-bb8f-4550-9077-d479afb3e7fd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-vif-deleted-871566ba-afd5-437e-944d-0d0e6c395933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.019 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.020 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.089 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.110 187082 DEBUG nova.compute.provider_tree [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.131 187082 DEBUG nova.scheduler.client.report [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.153 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.183 187082 INFO nova.scheduler.client.report [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Deleted allocations for instance 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.276 187082 DEBUG oslo_concurrency.lockutils [None req-d860af7d-80e1-4b8b-9443-c36d20940cfd bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.303 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.313 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.313 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5468MB free_disk=73.40827560424805GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.314 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.314 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.335 187082 DEBUG nova.compute.manager [req-ac94db29-abe8-4b98-a913-a0cb700f5b76 req-46daa091-13c0-4ea1-acce-1cbf103273ae 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.335 187082 DEBUG oslo_concurrency.lockutils [req-ac94db29-abe8-4b98-a913-a0cb700f5b76 req-46daa091-13c0-4ea1-acce-1cbf103273ae 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.335 187082 DEBUG oslo_concurrency.lockutils [req-ac94db29-abe8-4b98-a913-a0cb700f5b76 req-46daa091-13c0-4ea1-acce-1cbf103273ae 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.335 187082 DEBUG oslo_concurrency.lockutils [req-ac94db29-abe8-4b98-a913-a0cb700f5b76 req-46daa091-13c0-4ea1-acce-1cbf103273ae 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.336 187082 DEBUG nova.compute.manager [req-ac94db29-abe8-4b98-a913-a0cb700f5b76 req-46daa091-13c0-4ea1-acce-1cbf103273ae 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] No waiting events found dispatching network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.336 187082 WARNING nova.compute.manager [req-ac94db29-abe8-4b98-a913-a0cb700f5b76 req-46daa091-13c0-4ea1-acce-1cbf103273ae 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Received unexpected event network-vif-plugged-871566ba-afd5-437e-944d-0d0e6c395933 for instance with vm_state deleted and task_state None.
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.375 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance e949dac9-04e8-4bf5-b73c-32ab3fc59472 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.375 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 26455831-8aa8-4319-ae19-8d4c064580c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.375 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.375 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.440 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.457 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.488 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.489 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.539 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990392.53335, bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.539 187082 INFO nova.compute.manager [-] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] VM Stopped (Lifecycle Event)
Nov 24 13:20:07 compute-1 nova_compute[187078]: 2025-11-24 13:20:07.565 187082 DEBUG nova.compute.manager [None req-43f21700-e39d-45bd-be65-4b0ddd99b009 - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.161 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.161 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.162 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.162 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.162 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.163 187082 INFO nova.compute.manager [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Terminating instance
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.164 187082 DEBUG nova.compute.manager [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:20:08 compute-1 kernel: tapd61a5c3c-44 (unregistering): left promiscuous mode
Nov 24 13:20:08 compute-1 NetworkManager[55527]: <info>  [1763990408.1920] device (tapd61a5c3c-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:20:08 compute-1 ovn_controller[95368]: 2025-11-24T13:20:08Z|00068|binding|INFO|Releasing lport d61a5c3c-44c2-4a1b-bb74-6615866e2f1a from this chassis (sb_readonly=0)
Nov 24 13:20:08 compute-1 ovn_controller[95368]: 2025-11-24T13:20:08Z|00069|binding|INFO|Setting lport d61a5c3c-44c2-4a1b-bb74-6615866e2f1a down in Southbound
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.200 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 ovn_controller[95368]: 2025-11-24T13:20:08Z|00070|binding|INFO|Removing iface tapd61a5c3c-44 ovn-installed in OVS
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.206 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:4a:b5 10.100.0.9'], port_security=['fa:16:3e:c6:4a:b5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '26455831-8aa8-4319-ae19-8d4c064580c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.208 104225 INFO neutron.agent.ovn.metadata.agent [-] Port d61a5c3c-44c2-4a1b-bb74-6615866e2f1a in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.209 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 173735b5-05cb-4490-be96-4caf1fa864d7
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.219 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.224 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[39862bfb-40f4-4ab7-b0a2-1fc4cc7ca0ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.258 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[85d85027-7d9b-4660-9207-1ace80143fc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.261 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[4c5f39cc-9268-43a8-9f42-9e1f310e1d4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:08 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 24 13:20:08 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.927s CPU time.
Nov 24 13:20:08 compute-1 systemd-machined[153355]: Machine qemu-5-instance-00000005 terminated.
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.292 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb9d983-d2a8-4469-819a-ed54ac3f4040]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.314 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[106260e6-534d-4f44-bf20-e91e7e0b01d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap173735b5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:aa:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 784, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 19, 'rx_bytes': 784, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333055, 'reachable_time': 15959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209993, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.332 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8df774f5-8eba-4b4c-a29c-f1db092d6294]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333073, 'tstamp': 333073}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209994, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap173735b5-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333078, 'tstamp': 333078}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209994, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.334 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.335 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.341 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.341 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap173735b5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.341 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.342 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap173735b5-00, col_values=(('external_ids', {'iface-id': '05d2a163-89ad-4be0-a5cd-d2951a560cf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:08.342 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.434 187082 INFO nova.virt.libvirt.driver [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Instance destroyed successfully.
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.435 187082 DEBUG nova.objects.instance [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'resources' on Instance uuid 26455831-8aa8-4319-ae19-8d4c064580c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.446 187082 DEBUG nova.virt.libvirt.vif [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:19:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1452138830',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1452138830',id=5,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:19:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-hf1m3cwa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:19:59Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=26455831-8aa8-4319-ae19-8d4c064580c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.447 187082 DEBUG nova.network.os_vif_util [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "address": "fa:16:3e:c6:4a:b5", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61a5c3c-44", "ovs_interfaceid": "d61a5c3c-44c2-4a1b-bb74-6615866e2f1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.447 187082 DEBUG nova.network.os_vif_util [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.448 187082 DEBUG os_vif [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.449 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.450 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd61a5c3c-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.452 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.453 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.455 187082 INFO os_vif [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:4a:b5,bridge_name='br-int',has_traffic_filtering=True,id=d61a5c3c-44c2-4a1b-bb74-6615866e2f1a,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd61a5c3c-44')
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.456 187082 INFO nova.virt.libvirt.driver [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Deleting instance files /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2_del
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.461 187082 INFO nova.virt.libvirt.driver [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Deletion of /var/lib/nova/instances/26455831-8aa8-4319-ae19-8d4c064580c2_del complete
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.483 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.507 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.507 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.512 187082 INFO nova.compute.manager [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.513 187082 DEBUG oslo.service.loopingcall [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.513 187082 DEBUG nova.compute.manager [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.513 187082 DEBUG nova.network.neutron [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.523 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: bc8c3f5d-d27d-43e5-9e2f-1652b40afcc3] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.524 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:20:08 compute-1 nova_compute[187078]: 2025-11-24 13:20:08.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.225 187082 DEBUG nova.network.neutron [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.241 187082 INFO nova.compute.manager [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Took 0.73 seconds to deallocate network for instance.
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.281 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.282 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.289 187082 DEBUG nova.compute.manager [req-599cebb8-2aeb-48fa-b9cd-5bf4bdcbfe51 req-56ef6cf4-322a-463f-b534-e00994de846c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-deleted-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.388 187082 DEBUG nova.compute.provider_tree [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.402 187082 DEBUG nova.scheduler.client.report [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.427 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.441 187082 DEBUG nova.compute.manager [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-unplugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.441 187082 DEBUG oslo_concurrency.lockutils [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.442 187082 DEBUG oslo_concurrency.lockutils [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.442 187082 DEBUG oslo_concurrency.lockutils [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.443 187082 DEBUG nova.compute.manager [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] No waiting events found dispatching network-vif-unplugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.443 187082 WARNING nova.compute.manager [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received unexpected event network-vif-unplugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for instance with vm_state deleted and task_state None.
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.443 187082 DEBUG nova.compute.manager [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.444 187082 DEBUG oslo_concurrency.lockutils [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.444 187082 DEBUG oslo_concurrency.lockutils [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.444 187082 DEBUG oslo_concurrency.lockutils [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.445 187082 DEBUG nova.compute.manager [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] No waiting events found dispatching network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.445 187082 WARNING nova.compute.manager [req-6fdb299d-2269-4405-907f-cba0c0b9eebd req-ae2155ef-ed4b-4400-ac0d-e87345fecb07 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Received unexpected event network-vif-plugged-d61a5c3c-44c2-4a1b-bb74-6615866e2f1a for instance with vm_state deleted and task_state None.
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.448 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.454 187082 WARNING nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.455 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Triggering sync for uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.455 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Triggering sync for uuid 26455831-8aa8-4319-ae19-8d4c064580c2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.455 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.456 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.457 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "26455831-8aa8-4319-ae19-8d4c064580c2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.480 187082 INFO nova.scheduler.client.report [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Deleted allocations for instance 26455831-8aa8-4319-ae19-8d4c064580c2
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.487 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.539 187082 DEBUG oslo_concurrency.lockutils [None req-df656733-a86d-410e-8972-c44158c53ac0 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "26455831-8aa8-4319-ae19-8d4c064580c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.540 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "26455831-8aa8-4319-ae19-8d4c064580c2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:09 compute-1 nova_compute[187078]: 2025-11-24 13:20:09.559 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "26455831-8aa8-4319-ae19-8d4c064580c2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:10 compute-1 nova_compute[187078]: 2025-11-24 13:20:10.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:20:10 compute-1 nova_compute[187078]: 2025-11-24 13:20:10.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:20:10 compute-1 nova_compute[187078]: 2025-11-24 13:20:10.686 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:20:11 compute-1 sshd-session[210013]: Received disconnect from 68.183.82.237 port 60832:11: Bye Bye [preauth]
Nov 24 13:20:11 compute-1 sshd-session[210013]: Disconnected from authenticating user root 68.183.82.237 port 60832 [preauth]
Nov 24 13:20:12 compute-1 nova_compute[187078]: 2025-11-24 13:20:12.305 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:13 compute-1 nova_compute[187078]: 2025-11-24 13:20:13.452 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:17 compute-1 nova_compute[187078]: 2025-11-24 13:20:17.307 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:18 compute-1 nova_compute[187078]: 2025-11-24 13:20:18.453 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:19 compute-1 openstack_network_exporter[199599]: ERROR   13:20:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:20:19 compute-1 openstack_network_exporter[199599]: ERROR   13:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:20:19 compute-1 openstack_network_exporter[199599]: ERROR   13:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:20:19 compute-1 openstack_network_exporter[199599]: ERROR   13:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:20:19 compute-1 openstack_network_exporter[199599]: ERROR   13:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.001 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990404.9974012, 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.002 187082 INFO nova.compute.manager [-] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] VM Stopped (Lifecycle Event)
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.038 187082 DEBUG nova.compute.manager [None req-a4afae79-d663-48bd-bbe1-b54b5502728a - - - - - -] [instance: 0ce9caa3-c8fa-4aa5-9c64-3d4dab191b6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.512 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.513 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.513 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.513 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.513 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.515 187082 INFO nova.compute.manager [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Terminating instance
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.516 187082 DEBUG nova.compute.manager [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:20:20 compute-1 kernel: tap3c9ecd74-5b (unregistering): left promiscuous mode
Nov 24 13:20:20 compute-1 NetworkManager[55527]: <info>  [1763990420.5415] device (tap3c9ecd74-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00071|binding|INFO|Releasing lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e from this chassis (sb_readonly=0)
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00072|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e down in Southbound
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00073|binding|INFO|Removing iface tap3c9ecd74-5b ovn-installed in OVS
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.552 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.560 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:b3:4c 10.100.0.13'], port_security=['fa:16:3e:10:b3:4c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.561 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.563 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 173735b5-05cb-4490-be96-4caf1fa864d7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.564 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[fb89ad42-a7c7-4783-a1a3-84eb365a2c98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.564 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 namespace which is not needed anymore
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.570 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 24 13:20:20 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Consumed 17.194s CPU time.
Nov 24 13:20:20 compute-1 systemd-machined[153355]: Machine qemu-2-instance-00000001 terminated.
Nov 24 13:20:20 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [NOTICE]   (209163) : haproxy version is 2.8.14-c23fe91
Nov 24 13:20:20 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [NOTICE]   (209163) : path to executable is /usr/sbin/haproxy
Nov 24 13:20:20 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [WARNING]  (209163) : Exiting Master process...
Nov 24 13:20:20 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [ALERT]    (209163) : Current worker (209165) exited with code 143 (Terminated)
Nov 24 13:20:20 compute-1 neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7[209159]: [WARNING]  (209163) : All workers exited. Exiting... (0)
Nov 24 13:20:20 compute-1 systemd[1]: libpod-77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4.scope: Deactivated successfully.
Nov 24 13:20:20 compute-1 podman[210036]: 2025-11-24 13:20:20.706781748 +0000 UTC m=+0.043466867 container died 77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:20:20 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4-userdata-shm.mount: Deactivated successfully.
Nov 24 13:20:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-be0d56d3b480e174b2c1a9cbe335fc80027f5f897ea4c68e39977dbe6921c3af-merged.mount: Deactivated successfully.
Nov 24 13:20:20 compute-1 kernel: tap3c9ecd74-5b: entered promiscuous mode
Nov 24 13:20:20 compute-1 NetworkManager[55527]: <info>  [1763990420.7420] manager: (tap3c9ecd74-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Nov 24 13:20:20 compute-1 systemd-udevd[210020]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:20:20 compute-1 kernel: tap3c9ecd74-5b (unregistering): left promiscuous mode
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00074|binding|INFO|Claiming lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for this chassis.
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00075|binding|INFO|3c9ecd74-5bbd-4ab3-ad59-929239c5a81e: Claiming fa:16:3e:10:b3:4c 10.100.0.13
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.743 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.755 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:b3:4c 10.100.0.13'], port_security=['fa:16:3e:10:b3:4c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:20:20 compute-1 podman[210036]: 2025-11-24 13:20:20.75820331 +0000 UTC m=+0.094888429 container cleanup 77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:20:20 compute-1 systemd[1]: libpod-conmon-77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4.scope: Deactivated successfully.
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00076|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e ovn-installed in OVS
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00077|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e up in Southbound
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00078|binding|INFO|Releasing lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e from this chassis (sb_readonly=1)
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00079|if_status|INFO|Dropped 2 log messages in last 28 seconds (most recently, 28 seconds ago) due to excessive rate
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00080|if_status|INFO|Not setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e down as sb is readonly
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.774 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00081|binding|INFO|Removing iface tap3c9ecd74-5b ovn-installed in OVS
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00082|binding|INFO|Releasing lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e from this chassis (sb_readonly=1)
Nov 24 13:20:20 compute-1 ovn_controller[95368]: 2025-11-24T13:20:20Z|00083|binding|INFO|Setting lport 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e down in Southbound
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.784 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:b3:4c 10.100.0.13'], port_security=['fa:16:3e:10:b3:4c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e949dac9-04e8-4bf5-b73c-32ab3fc59472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-173735b5-05cb-4490-be96-4caf1fa864d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5383ea8abbd144f89d959d9b1f9c052f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '594378bf-26a1-4a87-a303-601e00a5dcf8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=543fb748-92ef-48f8-b381-5d8e0ce7c890, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.786 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.797 187082 INFO nova.virt.libvirt.driver [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Instance destroyed successfully.
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.798 187082 DEBUG nova.objects.instance [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lazy-loading 'resources' on Instance uuid e949dac9-04e8-4bf5-b73c-32ab3fc59472 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.807 187082 DEBUG nova.virt.libvirt.vif [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1684048061',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1684048061',id=1,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:18:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5383ea8abbd144f89d959d9b1f9c052f',ramdisk_id='',reservation_id='r-yt51c9yn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1613857129',owner_user_name='tempest-TestExecuteActionsViaActuator-1613857129-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:18:38Z,user_data=None,user_id='bad71b4865594b828cc87e37a3107bc4',uuid=e949dac9-04e8-4bf5-b73c-32ab3fc59472,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.808 187082 DEBUG nova.network.os_vif_util [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converting VIF {"id": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "address": "fa:16:3e:10:b3:4c", "network": {"id": "173735b5-05cb-4490-be96-4caf1fa864d7", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-942574064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5383ea8abbd144f89d959d9b1f9c052f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c9ecd74-5b", "ovs_interfaceid": "3c9ecd74-5bbd-4ab3-ad59-929239c5a81e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.808 187082 DEBUG nova.network.os_vif_util [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.809 187082 DEBUG os_vif [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.810 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.810 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c9ecd74-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.812 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.813 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.815 187082 INFO os_vif [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:b3:4c,bridge_name='br-int',has_traffic_filtering=True,id=3c9ecd74-5bbd-4ab3-ad59-929239c5a81e,network=Network(173735b5-05cb-4490-be96-4caf1fa864d7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c9ecd74-5b')
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.815 187082 INFO nova.virt.libvirt.driver [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Deleting instance files /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_del
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.816 187082 INFO nova.virt.libvirt.driver [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Deletion of /var/lib/nova/instances/e949dac9-04e8-4bf5-b73c-32ab3fc59472_del complete
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.852 187082 INFO nova.compute.manager [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.852 187082 DEBUG oslo.service.loopingcall [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.853 187082 DEBUG nova.compute.manager [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.853 187082 DEBUG nova.network.neutron [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:20:20 compute-1 podman[210070]: 2025-11-24 13:20:20.875242902 +0000 UTC m=+0.093354107 container remove 77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.881 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0833f8-88d6-429d-9d57-ab0858bf94a7]: (4, ('Mon Nov 24 01:20:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 (77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4)\n77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4\nMon Nov 24 01:20:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 (77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4)\n77a227736af99fb211a34c90e05adb16b79a862c816ee00e9df75f7e1f709ee4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.884 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7869b8-53f4-4c72-8f62-a7c5fc396008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.885 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap173735b5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.886 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 kernel: tap173735b5-00: left promiscuous mode
Nov 24 13:20:20 compute-1 nova_compute[187078]: 2025-11-24 13:20:20.903 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.908 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[82b77d6f-3e7d-4a3f-8a4d-55bb5e029b90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.926 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4936aa1e-d933-42be-a6f6-aea5c36ba6c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.928 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[79fea6a2-26a8-4621-ba9c-f3f6fef65f1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.951 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d76247e0-5dd9-44c9-96e2-de4cb6f0ccaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333045, 'reachable_time': 27322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210094, 'error': None, 'target': 'ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.955 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-173735b5-05cb-4490-be96-4caf1fa864d7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.955 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9109f4-4ff6-4bf9-87f9-f051571efa29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 systemd[1]: run-netns-ovnmeta\x2d173735b5\x2d05cb\x2d4490\x2dbe96\x2d4caf1fa864d7.mount: Deactivated successfully.
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.956 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.957 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 173735b5-05cb-4490-be96-4caf1fa864d7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.958 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d13b007a-d4e8-4a82-b384-5389d834c6dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.959 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 3c9ecd74-5bbd-4ab3-ad59-929239c5a81e in datapath 173735b5-05cb-4490-be96-4caf1fa864d7 unbound from our chassis
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.960 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 173735b5-05cb-4490-be96-4caf1fa864d7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:20:20 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:20.960 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ab872738-2161-421e-8e12-9b2138d909bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.254 187082 DEBUG nova.compute.manager [req-2d2ccde6-c613-4429-8eec-06a22c6d3034 req-7d9d580f-f32c-4f70-b9c8-492d326fe2a5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-unplugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.254 187082 DEBUG oslo_concurrency.lockutils [req-2d2ccde6-c613-4429-8eec-06a22c6d3034 req-7d9d580f-f32c-4f70-b9c8-492d326fe2a5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.255 187082 DEBUG oslo_concurrency.lockutils [req-2d2ccde6-c613-4429-8eec-06a22c6d3034 req-7d9d580f-f32c-4f70-b9c8-492d326fe2a5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.255 187082 DEBUG oslo_concurrency.lockutils [req-2d2ccde6-c613-4429-8eec-06a22c6d3034 req-7d9d580f-f32c-4f70-b9c8-492d326fe2a5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.255 187082 DEBUG nova.compute.manager [req-2d2ccde6-c613-4429-8eec-06a22c6d3034 req-7d9d580f-f32c-4f70-b9c8-492d326fe2a5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-unplugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.256 187082 DEBUG nova.compute.manager [req-2d2ccde6-c613-4429-8eec-06a22c6d3034 req-7d9d580f-f32c-4f70-b9c8-492d326fe2a5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-unplugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.480 187082 DEBUG nova.network.neutron [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.613 187082 INFO nova.compute.manager [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Took 0.76 seconds to deallocate network for instance.
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.646 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.646 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.698 187082 DEBUG nova.compute.provider_tree [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.715 187082 DEBUG nova.scheduler.client.report [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.745 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.775 187082 INFO nova.scheduler.client.report [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Deleted allocations for instance e949dac9-04e8-4bf5-b73c-32ab3fc59472
Nov 24 13:20:21 compute-1 nova_compute[187078]: 2025-11-24 13:20:21.840 187082 DEBUG oslo_concurrency.lockutils [None req-f994a3d0-f525-412b-a041-20d503e51120 bad71b4865594b828cc87e37a3107bc4 5383ea8abbd144f89d959d9b1f9c052f - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:22 compute-1 nova_compute[187078]: 2025-11-24 13:20:22.310 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:22 compute-1 podman[210095]: 2025-11-24 13:20:22.531161307 +0000 UTC m=+0.069377123 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.332 187082 DEBUG nova.compute.manager [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.333 187082 DEBUG oslo_concurrency.lockutils [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.333 187082 DEBUG oslo_concurrency.lockutils [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.333 187082 DEBUG oslo_concurrency.lockutils [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "e949dac9-04e8-4bf5-b73c-32ab3fc59472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.333 187082 DEBUG nova.compute.manager [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] No waiting events found dispatching network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.333 187082 WARNING nova.compute.manager [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received unexpected event network-vif-plugged-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e for instance with vm_state deleted and task_state None.
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.333 187082 DEBUG nova.compute.manager [req-2dcd648a-62f2-42c1-b959-de72e37f39c5 req-a2a2bdcb-0cb5-4920-aca3-45847cfc7fdc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Received event network-vif-deleted-3c9ecd74-5bbd-4ab3-ad59-929239c5a81e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.433 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990408.4325593, 26455831-8aa8-4319-ae19-8d4c064580c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.434 187082 INFO nova.compute.manager [-] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] VM Stopped (Lifecycle Event)
Nov 24 13:20:23 compute-1 nova_compute[187078]: 2025-11-24 13:20:23.452 187082 DEBUG nova.compute.manager [None req-483af229-5f77-4797-bedb-63d281587e98 - - - - - -] [instance: 26455831-8aa8-4319-ae19-8d4c064580c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:20:24 compute-1 podman[210119]: 2025-11-24 13:20:24.508775775 +0000 UTC m=+0.052673328 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 13:20:25 compute-1 nova_compute[187078]: 2025-11-24 13:20:25.812 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:26 compute-1 nova_compute[187078]: 2025-11-24 13:20:26.540 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:27 compute-1 nova_compute[187078]: 2025-11-24 13:20:27.312 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:29 compute-1 podman[210138]: 2025-11-24 13:20:29.551883724 +0000 UTC m=+0.087735573 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 13:20:29 compute-1 podman[210139]: 2025-11-24 13:20:29.553609402 +0000 UTC m=+0.091998130 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 24 13:20:30 compute-1 nova_compute[187078]: 2025-11-24 13:20:30.813 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:31 compute-1 sshd-session[210182]: Invalid user ubuntu from 61.240.213.113 port 33728
Nov 24 13:20:31 compute-1 sshd-session[210182]: Received disconnect from 61.240.213.113 port 33728:11:  [preauth]
Nov 24 13:20:31 compute-1 sshd-session[210182]: Disconnected from invalid user ubuntu 61.240.213.113 port 33728 [preauth]
Nov 24 13:20:32 compute-1 nova_compute[187078]: 2025-11-24 13:20:32.314 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:33 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:33.061 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:20:33 compute-1 nova_compute[187078]: 2025-11-24 13:20:33.062 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:33 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:33.063 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:20:34 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:20:34.066 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:20:35 compute-1 podman[197429]: time="2025-11-24T13:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:20:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:20:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Nov 24 13:20:35 compute-1 nova_compute[187078]: 2025-11-24 13:20:35.797 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990420.7957506, e949dac9-04e8-4bf5-b73c-32ab3fc59472 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:20:35 compute-1 nova_compute[187078]: 2025-11-24 13:20:35.797 187082 INFO nova.compute.manager [-] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] VM Stopped (Lifecycle Event)
Nov 24 13:20:35 compute-1 nova_compute[187078]: 2025-11-24 13:20:35.814 187082 DEBUG nova.compute.manager [None req-cb75927d-f6d9-46d5-80fd-9b0ddab02715 - - - - - -] [instance: e949dac9-04e8-4bf5-b73c-32ab3fc59472] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:20:35 compute-1 nova_compute[187078]: 2025-11-24 13:20:35.815 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:37 compute-1 nova_compute[187078]: 2025-11-24 13:20:37.316 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:37 compute-1 podman[210184]: 2025-11-24 13:20:37.529929186 +0000 UTC m=+0.075851009 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 24 13:20:40 compute-1 nova_compute[187078]: 2025-11-24 13:20:40.816 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:42 compute-1 nova_compute[187078]: 2025-11-24 13:20:42.318 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:45 compute-1 nova_compute[187078]: 2025-11-24 13:20:45.818 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:47 compute-1 sshd-session[210206]: Invalid user solana from 193.32.162.145 port 56096
Nov 24 13:20:47 compute-1 sshd-session[210206]: Connection closed by invalid user solana 193.32.162.145 port 56096 [preauth]
Nov 24 13:20:47 compute-1 nova_compute[187078]: 2025-11-24 13:20:47.320 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:48 compute-1 sshd-session[210208]: Invalid user azureuser from 5.198.176.28 port 43696
Nov 24 13:20:48 compute-1 sshd-session[210208]: Received disconnect from 5.198.176.28 port 43696:11: Bye Bye [preauth]
Nov 24 13:20:48 compute-1 sshd-session[210208]: Disconnected from invalid user azureuser 5.198.176.28 port 43696 [preauth]
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: ERROR   13:20:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: ERROR   13:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: ERROR   13:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: ERROR   13:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: ERROR   13:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:20:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:20:50 compute-1 nova_compute[187078]: 2025-11-24 13:20:50.820 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:52 compute-1 nova_compute[187078]: 2025-11-24 13:20:52.321 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:53 compute-1 podman[210210]: 2025-11-24 13:20:53.534767759 +0000 UTC m=+0.065670682 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:20:54 compute-1 sshd-session[210234]: Invalid user bitrix from 85.209.134.43 port 58922
Nov 24 13:20:54 compute-1 sshd-session[210234]: Received disconnect from 85.209.134.43 port 58922:11: Bye Bye [preauth]
Nov 24 13:20:54 compute-1 sshd-session[210234]: Disconnected from invalid user bitrix 85.209.134.43 port 58922 [preauth]
Nov 24 13:20:54 compute-1 podman[210236]: 2025-11-24 13:20:54.988635804 +0000 UTC m=+0.051008592 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:20:55 compute-1 nova_compute[187078]: 2025-11-24 13:20:55.822 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:57 compute-1 nova_compute[187078]: 2025-11-24 13:20:57.323 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.814 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.814 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.825 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.890 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.891 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.899 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.899 187082 INFO nova.compute.claims [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:20:59 compute-1 nova_compute[187078]: 2025-11-24 13:20:59.991 187082 DEBUG nova.compute.provider_tree [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.007 187082 DEBUG nova.scheduler.client.report [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.034 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.036 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.079 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.080 187082 DEBUG nova.network.neutron [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.095 187082 INFO nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.111 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.191 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.193 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.193 187082 INFO nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Creating image(s)
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.193 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "/var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.194 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "/var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.194 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "/var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.206 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.262 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.264 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.264 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.274 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.328 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.329 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.377 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.378 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.378 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.450 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.451 187082 DEBUG nova.virt.disk.api [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Checking if we can resize image /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.451 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:00 compute-1 podman[210266]: 2025-11-24 13:21:00.508166534 +0000 UTC m=+0.058114525 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.509 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.510 187082 DEBUG nova.virt.disk.api [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Cannot resize image /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.511 187082 DEBUG nova.objects.instance [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lazy-loading 'migration_context' on Instance uuid df72ea7f-d1a4-4c43-8112-40abfc528851 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.523 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.523 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Ensure instance console log exists: /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.524 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.524 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.524 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:00 compute-1 podman[210267]: 2025-11-24 13:21:00.535908811 +0000 UTC m=+0.081986707 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:21:00 compute-1 ovn_controller[95368]: 2025-11-24T13:21:00Z|00084|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.823 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:00 compute-1 nova_compute[187078]: 2025-11-24 13:21:00.918 187082 DEBUG nova.policy [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2260d70dd3d4ff8af8a70855d4981be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5351147be4ee48e9b79d613fbb862cef', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:21:02 compute-1 nova_compute[187078]: 2025-11-24 13:21:02.325 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:02 compute-1 nova_compute[187078]: 2025-11-24 13:21:02.442 187082 DEBUG nova.network.neutron [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Successfully created port: eca02106-5d92-46e8-8a0a-51addd985c5f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:21:02 compute-1 nova_compute[187078]: 2025-11-24 13:21:02.687 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:02 compute-1 nova_compute[187078]: 2025-11-24 13:21:02.994 187082 DEBUG nova.network.neutron [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Successfully updated port: eca02106-5d92-46e8-8a0a-51addd985c5f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.007 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.007 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquired lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.008 187082 DEBUG nova.network.neutron [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.094 187082 DEBUG nova.compute.manager [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-changed-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.095 187082 DEBUG nova.compute.manager [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Refreshing instance network info cache due to event network-changed-eca02106-5d92-46e8-8a0a-51addd985c5f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.095 187082 DEBUG oslo_concurrency.lockutils [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:21:03 compute-1 nova_compute[187078]: 2025-11-24 13:21:03.140 187082 DEBUG nova.network.neutron [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:21:03 compute-1 sshd-session[210318]: Invalid user sol from 45.148.10.240 port 37088
Nov 24 13:21:04 compute-1 sshd-session[210318]: Connection closed by invalid user sol 45.148.10.240 port 37088 [preauth]
Nov 24 13:21:04 compute-1 sshd-session[210316]: Received disconnect from 175.100.24.139 port 49946:11: Bye Bye [preauth]
Nov 24 13:21:04 compute-1 sshd-session[210316]: Disconnected from authenticating user root 175.100.24.139 port 49946 [preauth]
Nov 24 13:21:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:04.148 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:04.148 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:04.149 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.087 187082 DEBUG nova.network.neutron [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating instance_info_cache with network_info: [{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.101 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Releasing lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.102 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Instance network_info: |[{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.103 187082 DEBUG oslo_concurrency.lockutils [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.103 187082 DEBUG nova.network.neutron [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Refreshing network info cache for port eca02106-5d92-46e8-8a0a-51addd985c5f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.109 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Start _get_guest_xml network_info=[{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.116 187082 WARNING nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.127 187082 DEBUG nova.virt.libvirt.host [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.128 187082 DEBUG nova.virt.libvirt.host [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.132 187082 DEBUG nova.virt.libvirt.host [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.133 187082 DEBUG nova.virt.libvirt.host [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.135 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.135 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.136 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.137 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.137 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.137 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.138 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.138 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.139 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.139 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.139 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.140 187082 DEBUG nova.virt.hardware [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.146 187082 DEBUG nova.virt.libvirt.vif [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:20:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1354131249',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1354131249',id=7,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5351147be4ee48e9b79d613fbb862cef',ramdisk_id='',reservation_id='r-t05ev9cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-882837159',owner_user_name='tempest-TestExecuteBasicStrategy-882837159-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:21:00Z,user_data=None,user_id='f2260d70dd3d4ff8af8a70855d4981be',uuid=df72ea7f-d1a4-4c43-8112-40abfc528851,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.147 187082 DEBUG nova.network.os_vif_util [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Converting VIF {"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.148 187082 DEBUG nova.network.os_vif_util [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.149 187082 DEBUG nova.objects.instance [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lazy-loading 'pci_devices' on Instance uuid df72ea7f-d1a4-4c43-8112-40abfc528851 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.169 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <uuid>df72ea7f-d1a4-4c43-8112-40abfc528851</uuid>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <name>instance-00000007</name>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1354131249</nova:name>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:21:05</nova:creationTime>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:user uuid="f2260d70dd3d4ff8af8a70855d4981be">tempest-TestExecuteBasicStrategy-882837159-project-member</nova:user>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:project uuid="5351147be4ee48e9b79d613fbb862cef">tempest-TestExecuteBasicStrategy-882837159</nova:project>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         <nova:port uuid="eca02106-5d92-46e8-8a0a-51addd985c5f">
Nov 24 13:21:05 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <system>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <entry name="serial">df72ea7f-d1a4-4c43-8112-40abfc528851</entry>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <entry name="uuid">df72ea7f-d1a4-4c43-8112-40abfc528851</entry>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </system>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <os>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </os>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <features>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </features>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.config"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:ad:3a:fb"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <target dev="tapeca02106-5d"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/console.log" append="off"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <video>
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </video>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:21:05 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:21:05 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:21:05 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:21:05 compute-1 nova_compute[187078]: </domain>
Nov 24 13:21:05 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.171 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Preparing to wait for external event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.171 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.172 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.172 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.173 187082 DEBUG nova.virt.libvirt.vif [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:20:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1354131249',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1354131249',id=7,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5351147be4ee48e9b79d613fbb862cef',ramdisk_id='',reservation_id='r-t05ev9cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-882837159',owner_user_name='tempest-TestExecuteBasicStrategy-882837159-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:21:00Z,user_data=None,user_id='f2260d70dd3d4ff8af8a70855d4981be',uuid=df72ea7f-d1a4-4c43-8112-40abfc528851,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.174 187082 DEBUG nova.network.os_vif_util [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Converting VIF {"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.175 187082 DEBUG nova.network.os_vif_util [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.175 187082 DEBUG os_vif [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.176 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.177 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.177 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.182 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.182 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeca02106-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.183 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeca02106-5d, col_values=(('external_ids', {'iface-id': 'eca02106-5d92-46e8-8a0a-51addd985c5f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:3a:fb', 'vm-uuid': 'df72ea7f-d1a4-4c43-8112-40abfc528851'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.213 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:05 compute-1 NetworkManager[55527]: <info>  [1763990465.2149] manager: (tapeca02106-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.216 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.222 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.223 187082 INFO os_vif [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d')
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.277 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.278 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.278 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] No VIF found with MAC fa:16:3e:ad:3a:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.278 187082 INFO nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Using config drive
Nov 24 13:21:05 compute-1 podman[197429]: time="2025-11-24T13:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:21:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:21:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2583 "" "Go-http-client/1.1"
Nov 24 13:21:05 compute-1 nova_compute[187078]: 2025-11-24 13:21:05.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.068 187082 INFO nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Creating config drive at /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.config
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.077 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgtmtafeq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.201 187082 DEBUG oslo_concurrency.processutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgtmtafeq" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:06 compute-1 kernel: tapeca02106-5d: entered promiscuous mode
Nov 24 13:21:06 compute-1 NetworkManager[55527]: <info>  [1763990466.2679] manager: (tapeca02106-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Nov 24 13:21:06 compute-1 ovn_controller[95368]: 2025-11-24T13:21:06Z|00085|binding|INFO|Claiming lport eca02106-5d92-46e8-8a0a-51addd985c5f for this chassis.
Nov 24 13:21:06 compute-1 ovn_controller[95368]: 2025-11-24T13:21:06Z|00086|binding|INFO|eca02106-5d92-46e8-8a0a-51addd985c5f: Claiming fa:16:3e:ad:3a:fb 10.100.0.6
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.295 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.310 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:3a:fb 10.100.0.6'], port_security=['fa:16:3e:ad:3a:fb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'df72ea7f-d1a4-4c43-8112-40abfc528851', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47edc14-a00a-4224-95d0-485653fa3eb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5351147be4ee48e9b79d613fbb862cef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e4d88dfb-11f6-4768-9bef-248d999f9406', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf451210-30ae-4dba-b8ef-b220504e9c5f, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=eca02106-5d92-46e8-8a0a-51addd985c5f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.311 104225 INFO neutron.agent.ovn.metadata.agent [-] Port eca02106-5d92-46e8-8a0a-51addd985c5f in datapath f47edc14-a00a-4224-95d0-485653fa3eb5 bound to our chassis
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.312 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f47edc14-a00a-4224-95d0-485653fa3eb5
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.327 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d029fc61-3166-4c54-b955-8c4217653406]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.328 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf47edc14-a1 in ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.331 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf47edc14-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.332 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2d03a433-c54e-4de5-b24a-b9938da89b5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.333 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5dcbf6-ef0f-478a-a024-a06e6e28b3e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 systemd-machined[153355]: New machine qemu-6-instance-00000007.
Nov 24 13:21:06 compute-1 systemd-udevd[210342]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.347 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[99741569-9fb1-43af-9dd3-d3cd2ca14953]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 NetworkManager[55527]: <info>  [1763990466.3602] device (tapeca02106-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:21:06 compute-1 NetworkManager[55527]: <info>  [1763990466.3616] device (tapeca02106-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.381 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.381 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[bad0172f-8b7e-479e-8793-01ed9194f639]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-00000007.
Nov 24 13:21:06 compute-1 ovn_controller[95368]: 2025-11-24T13:21:06Z|00087|binding|INFO|Setting lport eca02106-5d92-46e8-8a0a-51addd985c5f ovn-installed in OVS
Nov 24 13:21:06 compute-1 ovn_controller[95368]: 2025-11-24T13:21:06Z|00088|binding|INFO|Setting lport eca02106-5d92-46e8-8a0a-51addd985c5f up in Southbound
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.386 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.413 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[dd04b03f-0864-475d-8172-b52063270cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.419 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ccafbbfc-8402-4a26-940c-484a36ae1072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 NetworkManager[55527]: <info>  [1763990466.4205] manager: (tapf47edc14-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.453 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c39ecfb2-5b09-4a37-8c00-a701d3cfbad4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.456 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[98979c53-40a5-4a10-bdd6-1eaddef45b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 NetworkManager[55527]: <info>  [1763990466.4801] device (tapf47edc14-a0): carrier: link connected
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.484 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[6e2a2907-7cc7-44bd-ac22-51ffcef53baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.501 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[604acc36-cfa4-4b65-9143-7f3fded2da75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf47edc14-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:5c:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348514, 'reachable_time': 41950, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210374, 'error': None, 'target': 'ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.521 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[6e95f8ea-6f1c-44f7-873d-1f477bdb9fa2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:5c2c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348514, 'tstamp': 348514}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210375, 'error': None, 'target': 'ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.538 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0718c879-6007-4862-b603-98ac623bc7f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf47edc14-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:5c:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348514, 'reachable_time': 41950, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210376, 'error': None, 'target': 'ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.575 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2e5bd995-93af-4bc7-a862-52888fdcd8d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.631 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1ddec243-7485-4c89-b9b9-3200b6a27e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.634 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf47edc14-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.634 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.635 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf47edc14-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:21:06 compute-1 kernel: tapf47edc14-a0: entered promiscuous mode
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.637 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:06 compute-1 NetworkManager[55527]: <info>  [1763990466.6410] manager: (tapf47edc14-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.641 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf47edc14-a0, col_values=(('external_ids', {'iface-id': 'bb948684-0614-4cb7-b7b5-3b6b8613932a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:21:06 compute-1 ovn_controller[95368]: 2025-11-24T13:21:06Z|00089|binding|INFO|Releasing lport bb948684-0614-4cb7-b7b5-3b6b8613932a from this chassis (sb_readonly=0)
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.642 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.644 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f47edc14-a00a-4224-95d0-485653fa3eb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f47edc14-a00a-4224-95d0-485653fa3eb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.646 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2f11bb69-8c2a-435d-812e-186e13c80fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.646 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-f47edc14-a00a-4224-95d0-485653fa3eb5
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/f47edc14-a00a-4224-95d0-485653fa3eb5.pid.haproxy
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID f47edc14-a00a-4224-95d0-485653fa3eb5
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:21:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:21:06.647 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5', 'env', 'PROCESS_TAG=haproxy-f47edc14-a00a-4224-95d0-485653fa3eb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f47edc14-a00a-4224-95d0-485653fa3eb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.654 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.691 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.759 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.822 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.823 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:06 compute-1 nova_compute[187078]: 2025-11-24 13:21:06.885 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.069 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990467.0686305, df72ea7f-d1a4-4c43-8112-40abfc528851 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.070 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] VM Started (Lifecycle Event)
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.101 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.106 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990467.069712, df72ea7f-d1a4-4c43-8112-40abfc528851 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.107 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] VM Paused (Lifecycle Event)
Nov 24 13:21:07 compute-1 podman[210419]: 2025-11-24 13:21:07.025564536 +0000 UTC m=+0.035510409 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.120 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.122 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5843MB free_disk=73.4611930847168GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.122 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.123 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.134 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.137 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.164 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.224 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance df72ea7f-d1a4-4c43-8112-40abfc528851 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.225 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.225 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:21:07 compute-1 podman[210419]: 2025-11-24 13:21:07.255165257 +0000 UTC m=+0.265111100 container create e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.278 187082 DEBUG nova.compute.manager [req-e0a7ca8e-8434-4199-b6a7-79d2c0b63018 req-d83d8cb0-3acf-4f9a-bbb2-bb2f2bb11a0d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.279 187082 DEBUG oslo_concurrency.lockutils [req-e0a7ca8e-8434-4199-b6a7-79d2c0b63018 req-d83d8cb0-3acf-4f9a-bbb2-bb2f2bb11a0d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.280 187082 DEBUG oslo_concurrency.lockutils [req-e0a7ca8e-8434-4199-b6a7-79d2c0b63018 req-d83d8cb0-3acf-4f9a-bbb2-bb2f2bb11a0d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.280 187082 DEBUG oslo_concurrency.lockutils [req-e0a7ca8e-8434-4199-b6a7-79d2c0b63018 req-d83d8cb0-3acf-4f9a-bbb2-bb2f2bb11a0d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.280 187082 DEBUG nova.compute.manager [req-e0a7ca8e-8434-4199-b6a7-79d2c0b63018 req-d83d8cb0-3acf-4f9a-bbb2-bb2f2bb11a0d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Processing event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.282 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.284 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.287 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990467.2869725, df72ea7f-d1a4-4c43-8112-40abfc528851 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.288 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] VM Resumed (Lifecycle Event)
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.290 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.294 187082 INFO nova.virt.libvirt.driver [-] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Instance spawned successfully.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.294 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:21:07 compute-1 systemd[1]: Started libpod-conmon-e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee.scope.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.304 187082 DEBUG nova.network.neutron [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updated VIF entry in instance network info cache for port eca02106-5d92-46e8-8a0a-51addd985c5f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.305 187082 DEBUG nova.network.neutron [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating instance_info_cache with network_info: [{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.314 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.320 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.324 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.333 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.334 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.334 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.335 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.335 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.335 187082 DEBUG nova.virt.libvirt.driver [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.339 187082 DEBUG oslo_concurrency.lockutils [req-46bed593-bc3f-484d-bb72-d2cc7b7fbf80 req-5bb61596-747f-44d5-acb6-02c505704020 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.340 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.340 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.341 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:21:07 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.374 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0fde7aea16683e9a6c1ebdc4a26d3c5213ebda78c127e2bf5a686ca19a0d3c87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.386 187082 INFO nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Took 7.19 seconds to spawn the instance on the hypervisor.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.387 187082 DEBUG nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:21:07 compute-1 podman[210419]: 2025-11-24 13:21:07.401694683 +0000 UTC m=+0.411640546 container init e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 13:21:07 compute-1 podman[210419]: 2025-11-24 13:21:07.40928755 +0000 UTC m=+0.419233393 container start e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 13:21:07 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [NOTICE]   (210439) : New worker (210441) forked
Nov 24 13:21:07 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [NOTICE]   (210439) : Loading success.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.446 187082 INFO nova.compute.manager [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Took 7.58 seconds to build instance.
Nov 24 13:21:07 compute-1 nova_compute[187078]: 2025-11-24 13:21:07.456 187082 DEBUG oslo_concurrency.lockutils [None req-5932c727-9dfc-4a59-a33b-3056944b9f82 f2260d70dd3d4ff8af8a70855d4981be 5351147be4ee48e9b79d613fbb862cef - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:08 compute-1 nova_compute[187078]: 2025-11-24 13:21:08.341 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:08 compute-1 nova_compute[187078]: 2025-11-24 13:21:08.342 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:08 compute-1 nova_compute[187078]: 2025-11-24 13:21:08.342 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:21:08 compute-1 podman[210452]: 2025-11-24 13:21:08.548500185 +0000 UTC m=+0.078748139 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 24 13:21:08 compute-1 nova_compute[187078]: 2025-11-24 13:21:08.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:08 compute-1 sshd-session[210450]: Invalid user fan from 176.114.89.34 port 55900
Nov 24 13:21:08 compute-1 sshd-session[210450]: Received disconnect from 176.114.89.34 port 55900:11: Bye Bye [preauth]
Nov 24 13:21:08 compute-1 sshd-session[210450]: Disconnected from invalid user fan 176.114.89.34 port 55900 [preauth]
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.337 187082 DEBUG nova.compute.manager [req-b448859a-65da-49a8-822b-b8f5226dcc6c req-851b17de-5164-4922-bb9f-406cdc22b3ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.338 187082 DEBUG oslo_concurrency.lockutils [req-b448859a-65da-49a8-822b-b8f5226dcc6c req-851b17de-5164-4922-bb9f-406cdc22b3ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.339 187082 DEBUG oslo_concurrency.lockutils [req-b448859a-65da-49a8-822b-b8f5226dcc6c req-851b17de-5164-4922-bb9f-406cdc22b3ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.339 187082 DEBUG oslo_concurrency.lockutils [req-b448859a-65da-49a8-822b-b8f5226dcc6c req-851b17de-5164-4922-bb9f-406cdc22b3ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.340 187082 DEBUG nova.compute.manager [req-b448859a-65da-49a8-822b-b8f5226dcc6c req-851b17de-5164-4922-bb9f-406cdc22b3ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.340 187082 WARNING nova.compute.manager [req-b448859a-65da-49a8-822b-b8f5226dcc6c req-851b17de-5164-4922-bb9f-406cdc22b3ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received unexpected event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with vm_state active and task_state None.
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.849 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.850 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.850 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:21:09 compute-1 nova_compute[187078]: 2025-11-24 13:21:09.850 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid df72ea7f-d1a4-4c43-8112-40abfc528851 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:21:10 compute-1 nova_compute[187078]: 2025-11-24 13:21:10.215 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:12 compute-1 nova_compute[187078]: 2025-11-24 13:21:12.163 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating instance_info_cache with network_info: [{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:21:12 compute-1 nova_compute[187078]: 2025-11-24 13:21:12.302 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:21:12 compute-1 nova_compute[187078]: 2025-11-24 13:21:12.303 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:21:12 compute-1 nova_compute[187078]: 2025-11-24 13:21:12.375 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:15 compute-1 nova_compute[187078]: 2025-11-24 13:21:15.220 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:17 compute-1 nova_compute[187078]: 2025-11-24 13:21:17.378 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: ERROR   13:21:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: ERROR   13:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: ERROR   13:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: ERROR   13:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: ERROR   13:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:21:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:21:20 compute-1 nova_compute[187078]: 2025-11-24 13:21:20.222 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:21 compute-1 ovn_controller[95368]: 2025-11-24T13:21:21Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:3a:fb 10.100.0.6
Nov 24 13:21:21 compute-1 ovn_controller[95368]: 2025-11-24T13:21:21Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:3a:fb 10.100.0.6
Nov 24 13:21:22 compute-1 nova_compute[187078]: 2025-11-24 13:21:22.380 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:24 compute-1 podman[210486]: 2025-11-24 13:21:24.552215944 +0000 UTC m=+0.086333985 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:21:25 compute-1 nova_compute[187078]: 2025-11-24 13:21:25.226 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:25 compute-1 podman[210512]: 2025-11-24 13:21:25.521683884 +0000 UTC m=+0.064816776 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 24 13:21:27 compute-1 sshd-session[210532]: Invalid user python from 68.183.82.237 port 34656
Nov 24 13:21:27 compute-1 nova_compute[187078]: 2025-11-24 13:21:27.381 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:27 compute-1 sshd-session[210532]: Received disconnect from 68.183.82.237 port 34656:11: Bye Bye [preauth]
Nov 24 13:21:27 compute-1 sshd-session[210532]: Disconnected from invalid user python 68.183.82.237 port 34656 [preauth]
Nov 24 13:21:30 compute-1 nova_compute[187078]: 2025-11-24 13:21:30.231 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:31 compute-1 podman[210534]: 2025-11-24 13:21:31.542771093 +0000 UTC m=+0.083503478 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:21:31 compute-1 podman[210535]: 2025-11-24 13:21:31.596059802 +0000 UTC m=+0.120341607 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:21:32 compute-1 nova_compute[187078]: 2025-11-24 13:21:32.383 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:35 compute-1 nova_compute[187078]: 2025-11-24 13:21:35.235 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:35 compute-1 podman[197429]: time="2025-11-24T13:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:21:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:21:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Nov 24 13:21:37 compute-1 nova_compute[187078]: 2025-11-24 13:21:37.385 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:39 compute-1 podman[210583]: 2025-11-24 13:21:39.537972833 +0000 UTC m=+0.074834981 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:21:40 compute-1 nova_compute[187078]: 2025-11-24 13:21:40.239 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:42 compute-1 sshd-session[210581]: Invalid user redmine from 45.78.194.40 port 59908
Nov 24 13:21:42 compute-1 sshd-session[210581]: Received disconnect from 45.78.194.40 port 59908:11: Bye Bye [preauth]
Nov 24 13:21:42 compute-1 sshd-session[210581]: Disconnected from invalid user redmine 45.78.194.40 port 59908 [preauth]
Nov 24 13:21:42 compute-1 nova_compute[187078]: 2025-11-24 13:21:42.387 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:45 compute-1 nova_compute[187078]: 2025-11-24 13:21:45.302 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:46 compute-1 ovn_controller[95368]: 2025-11-24T13:21:46Z|00090|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 24 13:21:47 compute-1 nova_compute[187078]: 2025-11-24 13:21:47.390 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: ERROR   13:21:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: ERROR   13:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: ERROR   13:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: ERROR   13:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: ERROR   13:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:21:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:21:50 compute-1 nova_compute[187078]: 2025-11-24 13:21:50.304 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:50 compute-1 nova_compute[187078]: 2025-11-24 13:21:50.466 187082 DEBUG nova.compute.manager [None req-43de1158-8ca0-4e66-8f97-2c9b9d8c57a4 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 24 13:21:50 compute-1 nova_compute[187078]: 2025-11-24 13:21:50.514 187082 DEBUG nova.compute.provider_tree [None req-43de1158-8ca0-4e66-8f97-2c9b9d8c57a4 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 4 to 14 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:21:52 compute-1 nova_compute[187078]: 2025-11-24 13:21:52.392 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:52 compute-1 sshd-session[210606]: Invalid user developer from 85.209.134.43 port 57588
Nov 24 13:21:52 compute-1 sshd-session[210606]: Received disconnect from 85.209.134.43 port 57588:11: Bye Bye [preauth]
Nov 24 13:21:52 compute-1 sshd-session[210606]: Disconnected from invalid user developer 85.209.134.43 port 57588 [preauth]
Nov 24 13:21:54 compute-1 sshd-session[210608]: Invalid user testuser from 5.198.176.28 port 43804
Nov 24 13:21:54 compute-1 sshd-session[210608]: Received disconnect from 5.198.176.28 port 43804:11: Bye Bye [preauth]
Nov 24 13:21:54 compute-1 sshd-session[210608]: Disconnected from invalid user testuser 5.198.176.28 port 43804 [preauth]
Nov 24 13:21:55 compute-1 nova_compute[187078]: 2025-11-24 13:21:55.307 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:55 compute-1 podman[210610]: 2025-11-24 13:21:55.56037416 +0000 UTC m=+0.092854974 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:21:55 compute-1 podman[210634]: 2025-11-24 13:21:55.670732752 +0000 UTC m=+0.074511851 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:21:56 compute-1 nova_compute[187078]: 2025-11-24 13:21:56.700 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Check if temp file /var/lib/nova/instances/tmptiawg8qp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:21:56 compute-1 nova_compute[187078]: 2025-11-24 13:21:56.700 187082 DEBUG nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptiawg8qp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df72ea7f-d1a4-4c43-8112-40abfc528851',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:21:57 compute-1 nova_compute[187078]: 2025-11-24 13:21:57.395 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:21:59 compute-1 nova_compute[187078]: 2025-11-24 13:21:59.896 187082 DEBUG oslo_concurrency.processutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:21:59 compute-1 nova_compute[187078]: 2025-11-24 13:21:59.958 187082 DEBUG oslo_concurrency.processutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:21:59 compute-1 nova_compute[187078]: 2025-11-24 13:21:59.960 187082 DEBUG oslo_concurrency.processutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:22:00 compute-1 nova_compute[187078]: 2025-11-24 13:22:00.066 187082 DEBUG oslo_concurrency.processutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:22:00 compute-1 nova_compute[187078]: 2025-11-24 13:22:00.311 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:02 compute-1 nova_compute[187078]: 2025-11-24 13:22:02.397 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:02 compute-1 podman[210661]: 2025-11-24 13:22:02.534744685 +0000 UTC m=+0.080806774 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:22:02 compute-1 podman[210662]: 2025-11-24 13:22:02.593189376 +0000 UTC m=+0.132053997 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:22:02 compute-1 nova_compute[187078]: 2025-11-24 13:22:02.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:03 compute-1 sshd-session[210707]: Accepted publickey for nova from 192.168.122.100 port 57398 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:22:03 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:22:03 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:22:03 compute-1 systemd-logind[815]: New session 35 of user nova.
Nov 24 13:22:03 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:22:03 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:22:03 compute-1 systemd[210711]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:22:03 compute-1 systemd[210711]: Queued start job for default target Main User Target.
Nov 24 13:22:03 compute-1 systemd[210711]: Created slice User Application Slice.
Nov 24 13:22:03 compute-1 systemd[210711]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:22:03 compute-1 systemd[210711]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:22:03 compute-1 systemd[210711]: Reached target Paths.
Nov 24 13:22:03 compute-1 systemd[210711]: Reached target Timers.
Nov 24 13:22:03 compute-1 systemd[210711]: Starting D-Bus User Message Bus Socket...
Nov 24 13:22:03 compute-1 systemd[210711]: Starting Create User's Volatile Files and Directories...
Nov 24 13:22:03 compute-1 systemd[210711]: Finished Create User's Volatile Files and Directories.
Nov 24 13:22:03 compute-1 systemd[210711]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:22:03 compute-1 systemd[210711]: Reached target Sockets.
Nov 24 13:22:03 compute-1 systemd[210711]: Reached target Basic System.
Nov 24 13:22:03 compute-1 systemd[210711]: Reached target Main User Target.
Nov 24 13:22:03 compute-1 systemd[210711]: Startup finished in 171ms.
Nov 24 13:22:03 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:22:03 compute-1 systemd[1]: Started Session 35 of User nova.
Nov 24 13:22:03 compute-1 sshd-session[210707]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:22:03 compute-1 sshd-session[210726]: Received disconnect from 192.168.122.100 port 57398:11: disconnected by user
Nov 24 13:22:03 compute-1 sshd-session[210726]: Disconnected from user nova 192.168.122.100 port 57398
Nov 24 13:22:03 compute-1 sshd-session[210707]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:22:03 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Nov 24 13:22:03 compute-1 systemd-logind[815]: Session 35 logged out. Waiting for processes to exit.
Nov 24 13:22:03 compute-1 systemd-logind[815]: Removed session 35.
Nov 24 13:22:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:04.150 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:04.152 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:04.153 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:04.383 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.385 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:04.387 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.418 187082 DEBUG nova.compute.manager [req-2bee804f-9c6d-4505-8de0-9cbe449f39e4 req-9ea920de-f4a6-4657-b42a-74ef1ac28cf3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.419 187082 DEBUG oslo_concurrency.lockutils [req-2bee804f-9c6d-4505-8de0-9cbe449f39e4 req-9ea920de-f4a6-4657-b42a-74ef1ac28cf3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.420 187082 DEBUG oslo_concurrency.lockutils [req-2bee804f-9c6d-4505-8de0-9cbe449f39e4 req-9ea920de-f4a6-4657-b42a-74ef1ac28cf3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.420 187082 DEBUG oslo_concurrency.lockutils [req-2bee804f-9c6d-4505-8de0-9cbe449f39e4 req-9ea920de-f4a6-4657-b42a-74ef1ac28cf3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.421 187082 DEBUG nova.compute.manager [req-2bee804f-9c6d-4505-8de0-9cbe449f39e4 req-9ea920de-f4a6-4657-b42a-74ef1ac28cf3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.421 187082 DEBUG nova.compute.manager [req-2bee804f-9c6d-4505-8de0-9cbe449f39e4 req-9ea920de-f4a6-4657-b42a-74ef1ac28cf3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.678 187082 INFO nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Took 4.61 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.679 187082 DEBUG nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.704 187082 DEBUG nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptiawg8qp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df72ea7f-d1a4-4c43-8112-40abfc528851',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c7c0b62e-ae3a-43f9-ae7c-db80dcb3f27d),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.733 187082 DEBUG nova.objects.instance [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid df72ea7f-d1a4-4c43-8112-40abfc528851 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.736 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.739 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.740 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.759 187082 DEBUG nova.virt.libvirt.vif [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:20:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1354131249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1354131249',id=7,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:21:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5351147be4ee48e9b79d613fbb862cef',ramdisk_id='',reservation_id='r-t05ev9cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-882837159',owner_user_name='tempest-TestExecuteBasicStrategy-882837159-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:21:07Z,user_data=None,user_id='f2260d70dd3d4ff8af8a70855d4981be',uuid=df72ea7f-d1a4-4c43-8112-40abfc528851,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.760 187082 DEBUG nova.network.os_vif_util [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.761 187082 DEBUG nova.network.os_vif_util [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.761 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:22:04 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:ad:3a:fb"/>
Nov 24 13:22:04 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:22:04 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:22:04 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:22:04 compute-1 nova_compute[187078]:   <target dev="tapeca02106-5d"/>
Nov 24 13:22:04 compute-1 nova_compute[187078]: </interface>
Nov 24 13:22:04 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:22:04 compute-1 nova_compute[187078]: 2025-11-24 13:22:04.762 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:22:05 compute-1 nova_compute[187078]: 2025-11-24 13:22:05.243 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:05 compute-1 nova_compute[187078]: 2025-11-24 13:22:05.244 187082 INFO nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:22:05 compute-1 nova_compute[187078]: 2025-11-24 13:22:05.313 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:05 compute-1 nova_compute[187078]: 2025-11-24 13:22:05.323 187082 INFO nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:22:05 compute-1 podman[197429]: time="2025-11-24T13:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:22:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:22:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Nov 24 13:22:05 compute-1 nova_compute[187078]: 2025-11-24 13:22:05.826 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:05 compute-1 nova_compute[187078]: 2025-11-24 13:22:05.827 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.389 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.389 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.514 187082 DEBUG nova.compute.manager [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.514 187082 DEBUG oslo_concurrency.lockutils [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.515 187082 DEBUG oslo_concurrency.lockutils [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.515 187082 DEBUG oslo_concurrency.lockutils [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.515 187082 DEBUG nova.compute.manager [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.516 187082 WARNING nova.compute.manager [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received unexpected event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with vm_state active and task_state migrating.
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.516 187082 DEBUG nova.compute.manager [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-changed-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.517 187082 DEBUG nova.compute.manager [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Refreshing instance network info cache due to event network-changed-eca02106-5d92-46e8-8a0a-51addd985c5f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.517 187082 DEBUG oslo_concurrency.lockutils [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.517 187082 DEBUG oslo_concurrency.lockutils [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.518 187082 DEBUG nova.network.neutron [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Refreshing network info cache for port eca02106-5d92-46e8-8a0a-51addd985c5f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.894 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:06 compute-1 nova_compute[187078]: 2025-11-24 13:22:06.895 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.400 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.400 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.401 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.730 187082 DEBUG nova.network.neutron [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updated VIF entry in instance network info cache for port eca02106-5d92-46e8-8a0a-51addd985c5f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.731 187082 DEBUG nova.network.neutron [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating instance_info_cache with network_info: [{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.904 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:07 compute-1 nova_compute[187078]: 2025-11-24 13:22:07.905 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.354 187082 DEBUG oslo_concurrency.lockutils [req-bd04ea12-c74a-456e-b363-0af44616a9ed req-e2886d36-f2f6-4bb3-af58-ffaa03ebc50a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.411 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.411 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.700 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.701 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.701 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.701 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.792 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.895 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990528.8941524, df72ea7f-d1a4-4c43-8112-40abfc528851 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.895 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] VM Paused (Lifecycle Event)
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.900 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.900 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.926 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.928 187082 DEBUG nova.virt.libvirt.migration [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.932 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.937 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.965 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:22:08 compute-1 nova_compute[187078]: 2025-11-24 13:22:08.969 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:22:09 compute-1 kernel: tapeca02106-5d (unregistering): left promiscuous mode
Nov 24 13:22:09 compute-1 NetworkManager[55527]: <info>  [1763990529.0681] device (tapeca02106-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.117 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:09 compute-1 ovn_controller[95368]: 2025-11-24T13:22:09Z|00091|binding|INFO|Releasing lport eca02106-5d92-46e8-8a0a-51addd985c5f from this chassis (sb_readonly=0)
Nov 24 13:22:09 compute-1 ovn_controller[95368]: 2025-11-24T13:22:09Z|00092|binding|INFO|Setting lport eca02106-5d92-46e8-8a0a-51addd985c5f down in Southbound
Nov 24 13:22:09 compute-1 ovn_controller[95368]: 2025-11-24T13:22:09Z|00093|binding|INFO|Removing iface tapeca02106-5d ovn-installed in OVS
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.121 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.133 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:3a:fb 10.100.0.6'], port_security=['fa:16:3e:ad:3a:fb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'df72ea7f-d1a4-4c43-8112-40abfc528851', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47edc14-a00a-4224-95d0-485653fa3eb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5351147be4ee48e9b79d613fbb862cef', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e4d88dfb-11f6-4768-9bef-248d999f9406', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf451210-30ae-4dba-b8ef-b220504e9c5f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=eca02106-5d92-46e8-8a0a-51addd985c5f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.135 104225 INFO neutron.agent.ovn.metadata.agent [-] Port eca02106-5d92-46e8-8a0a-51addd985c5f in datapath f47edc14-a00a-4224-95d0-485653fa3eb5 unbound from our chassis
Nov 24 13:22:09 compute-1 virtqemud[186628]: An error occurred, but the cause is unknown
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.139 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47edc14-a00a-4224-95d0-485653fa3eb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.142 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[abec954f-fc0c-41ed-905e-29c63a8b6e9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.142 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5 namespace which is not needed anymore
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.143 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:09 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 24 13:22:09 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000007.scope: Consumed 17.193s CPU time.
Nov 24 13:22:09 compute-1 systemd-machined[153355]: Machine qemu-6-instance-00000007 terminated.
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.302 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.303 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5649MB free_disk=73.43291854858398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.303 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.304 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.312 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.312 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.313 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.350 187082 INFO nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating resource usage from migration c7c0b62e-ae3a-43f9-ae7c-db80dcb3f27d
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.355 187082 DEBUG nova.compute.manager [req-e086bfa9-dfd8-4119-a497-c4d3d64f010d req-8e6d41e6-8049-41f7-add3-ee02a925b338 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.356 187082 DEBUG oslo_concurrency.lockutils [req-e086bfa9-dfd8-4119-a497-c4d3d64f010d req-8e6d41e6-8049-41f7-add3-ee02a925b338 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.356 187082 DEBUG oslo_concurrency.lockutils [req-e086bfa9-dfd8-4119-a497-c4d3d64f010d req-8e6d41e6-8049-41f7-add3-ee02a925b338 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.356 187082 DEBUG oslo_concurrency.lockutils [req-e086bfa9-dfd8-4119-a497-c4d3d64f010d req-8e6d41e6-8049-41f7-add3-ee02a925b338 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.357 187082 DEBUG nova.compute.manager [req-e086bfa9-dfd8-4119-a497-c4d3d64f010d req-8e6d41e6-8049-41f7-add3-ee02a925b338 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.357 187082 DEBUG nova.compute.manager [req-e086bfa9-dfd8-4119-a497-c4d3d64f010d req-8e6d41e6-8049-41f7-add3-ee02a925b338 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:22:09 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [NOTICE]   (210439) : haproxy version is 2.8.14-c23fe91
Nov 24 13:22:09 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [NOTICE]   (210439) : path to executable is /usr/sbin/haproxy
Nov 24 13:22:09 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [WARNING]  (210439) : Exiting Master process...
Nov 24 13:22:09 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [ALERT]    (210439) : Current worker (210441) exited with code 143 (Terminated)
Nov 24 13:22:09 compute-1 neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5[210435]: [WARNING]  (210439) : All workers exited. Exiting... (0)
Nov 24 13:22:09 compute-1 systemd[1]: libpod-e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee.scope: Deactivated successfully.
Nov 24 13:22:09 compute-1 podman[210774]: 2025-11-24 13:22:09.38457691 +0000 UTC m=+0.136019737 container died e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.387 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Migration c7c0b62e-ae3a-43f9-ae7c-db80dcb3f27d is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.388 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.388 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:22:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee-userdata-shm.mount: Deactivated successfully.
Nov 24 13:22:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-0fde7aea16683e9a6c1ebdc4a26d3c5213ebda78c127e2bf5a686ca19a0d3c87-merged.mount: Deactivated successfully.
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.432 187082 DEBUG nova.virt.libvirt.guest [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'df72ea7f-d1a4-4c43-8112-40abfc528851' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.434 187082 INFO nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migration operation has completed
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.434 187082 INFO nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] _post_live_migration() is started..
Nov 24 13:22:09 compute-1 podman[210774]: 2025-11-24 13:22:09.444560792 +0000 UTC m=+0.196003609 container cleanup e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.445 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:22:09 compute-1 systemd[1]: libpod-conmon-e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee.scope: Deactivated successfully.
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.459 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.488 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.489 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:09 compute-1 podman[210819]: 2025-11-24 13:22:09.526196188 +0000 UTC m=+0.052999722 container remove e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.533 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[352b95e6-8835-4192-89e9-7822a5ee93c9]: (4, ('Mon Nov 24 01:22:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5 (e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee)\ne89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee\nMon Nov 24 01:22:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5 (e89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee)\ne89a45ff499e04103527e290b75eef9ebac01b27adc497eeb94f5683ce0b46ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.535 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f64a3738-8ab9-4f02-9393-1d729cc27dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.536 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf47edc14-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.539 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:09 compute-1 kernel: tapf47edc14-a0: left promiscuous mode
Nov 24 13:22:09 compute-1 nova_compute[187078]: 2025-11-24 13:22:09.560 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.563 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[520859c5-ffeb-4748-a7c5-589116c7705e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.580 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e1964976-2c6b-4e89-8d0f-bd7b2ce09c18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.582 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[78b24e64-a808-4392-8250-b6b180374f97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.601 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a8cb8e9c-ac07-4e15-bda4-6e77610a09b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348506, 'reachable_time': 40842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210839, 'error': None, 'target': 'ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 systemd[1]: run-netns-ovnmeta\x2df47edc14\x2da00a\x2d4224\x2d95d0\x2d485653fa3eb5.mount: Deactivated successfully.
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.609 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f47edc14-a00a-4224-95d0-485653fa3eb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:22:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:09.609 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[baa0989b-e24d-4437-a770-ec3f963c8145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:22:09 compute-1 podman[210837]: 2025-11-24 13:22:09.684847952 +0000 UTC m=+0.070356097 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, version=9.6, architecture=x86_64, release=1755695350)
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.042 187082 DEBUG nova.network.neutron [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port eca02106-5d92-46e8-8a0a-51addd985c5f and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.042 187082 DEBUG nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.043 187082 DEBUG nova.virt.libvirt.vif [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:20:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1354131249',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1354131249',id=7,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:21:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5351147be4ee48e9b79d613fbb862cef',ramdisk_id='',reservation_id='r-t05ev9cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-882837159',owner_user_name='tempest-TestExecuteBasicStrategy-882837159-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:21:53Z,user_data=None,user_id='f2260d70dd3d4ff8af8a70855d4981be',uuid=df72ea7f-d1a4-4c43-8112-40abfc528851,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.044 187082 DEBUG nova.network.os_vif_util [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.045 187082 DEBUG nova.network.os_vif_util [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.045 187082 DEBUG os_vif [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.047 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.047 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeca02106-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.049 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.050 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.053 187082 INFO os_vif [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:3a:fb,bridge_name='br-int',has_traffic_filtering=True,id=eca02106-5d92-46e8-8a0a-51addd985c5f,network=Network(f47edc14-a00a-4224-95d0-485653fa3eb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca02106-5d')
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.054 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.054 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.054 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.055 187082 DEBUG nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.055 187082 INFO nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Deleting instance files /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851_del
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.056 187082 INFO nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Deletion of /var/lib/nova/instances/df72ea7f-d1a4-4c43-8112-40abfc528851_del complete
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.487 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.487 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.488 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.506 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.506 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.506 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:22:10 compute-1 nova_compute[187078]: 2025-11-24 13:22:10.507 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid df72ea7f-d1a4-4c43-8112-40abfc528851 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:22:11 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:22:11.390 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.418 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.418 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.418 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.418 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.419 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.419 187082 WARNING nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received unexpected event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with vm_state active and task_state migrating.
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.419 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.419 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.419 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.420 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.420 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.420 187082 WARNING nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received unexpected event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with vm_state active and task_state migrating.
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.420 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.420 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.421 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.421 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.421 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.421 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-unplugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.421 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.422 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.422 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.422 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.422 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.422 187082 WARNING nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received unexpected event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with vm_state active and task_state migrating.
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.422 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.423 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.423 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.423 187082 DEBUG oslo_concurrency.lockutils [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.423 187082 DEBUG nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] No waiting events found dispatching network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.423 187082 WARNING nova.compute.manager [req-8bb46352-660e-453f-8097-d17dbc7fc906 req-5a00019b-a621-4a8c-9e4a-7170d8f04217 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Received unexpected event network-vif-plugged-eca02106-5d92-46e8-8a0a-51addd985c5f for instance with vm_state active and task_state migrating.
Nov 24 13:22:11 compute-1 nova_compute[187078]: 2025-11-24 13:22:11.637 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updating instance_info_cache with network_info: [{"id": "eca02106-5d92-46e8-8a0a-51addd985c5f", "address": "fa:16:3e:ad:3a:fb", "network": {"id": "f47edc14-a00a-4224-95d0-485653fa3eb5", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1402171314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5351147be4ee48e9b79d613fbb862cef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca02106-5d", "ovs_interfaceid": "eca02106-5d92-46e8-8a0a-51addd985c5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:22:12 compute-1 nova_compute[187078]: 2025-11-24 13:22:12.402 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:12 compute-1 nova_compute[187078]: 2025-11-24 13:22:12.963 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-df72ea7f-d1a4-4c43-8112-40abfc528851" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:22:12 compute-1 nova_compute[187078]: 2025-11-24 13:22:12.964 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:22:12 compute-1 nova_compute[187078]: 2025-11-24 13:22:12.965 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:12 compute-1 nova_compute[187078]: 2025-11-24 13:22:12.965 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:22:13 compute-1 sshd-session[210860]: Invalid user ts3 from 176.114.89.34 port 40580
Nov 24 13:22:13 compute-1 sshd-session[210860]: Received disconnect from 176.114.89.34 port 40580:11: Bye Bye [preauth]
Nov 24 13:22:13 compute-1 sshd-session[210860]: Disconnected from invalid user ts3 176.114.89.34 port 40580 [preauth]
Nov 24 13:22:13 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:22:13 compute-1 systemd[210711]: Activating special unit Exit the Session...
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped target Main User Target.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped target Basic System.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped target Paths.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped target Sockets.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped target Timers.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:22:13 compute-1 systemd[210711]: Closed D-Bus User Message Bus Socket.
Nov 24 13:22:13 compute-1 systemd[210711]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:22:13 compute-1 systemd[210711]: Removed slice User Application Slice.
Nov 24 13:22:13 compute-1 systemd[210711]: Reached target Shutdown.
Nov 24 13:22:13 compute-1 systemd[210711]: Finished Exit the Session.
Nov 24 13:22:13 compute-1 systemd[210711]: Reached target Exit the Session.
Nov 24 13:22:13 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:22:13 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:22:13 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:22:13 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:22:13 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:22:13 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:22:13 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:22:14 compute-1 nova_compute[187078]: 2025-11-24 13:22:14.137 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:22:15 compute-1 nova_compute[187078]: 2025-11-24 13:22:15.050 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.177 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.178 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.179 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df72ea7f-d1a4-4c43-8112-40abfc528851-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.285 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.286 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.286 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.286 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.487 187082 WARNING nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.489 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.4619369506836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.489 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.490 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.544 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance df72ea7f-d1a4-4c43-8112-40abfc528851 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.566 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.609 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration c7c0b62e-ae3a-43f9-ae7c-db80dcb3f27d is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.610 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.610 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.669 187082 DEBUG nova.compute.provider_tree [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.689 187082 DEBUG nova.scheduler.client.report [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.738 187082 DEBUG nova.compute.resource_tracker [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.739 187082 DEBUG oslo_concurrency.lockutils [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.748 187082 INFO nova.compute.manager [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.915 187082 INFO nova.scheduler.client.report [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration c7c0b62e-ae3a-43f9-ae7c-db80dcb3f27d
Nov 24 13:22:16 compute-1 nova_compute[187078]: 2025-11-24 13:22:16.916 187082 DEBUG nova.virt.libvirt.driver [None req-0eea0239-2488-4f80-ab69-444500eadf99 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:22:17 compute-1 nova_compute[187078]: 2025-11-24 13:22:17.404 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: ERROR   13:22:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: ERROR   13:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: ERROR   13:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: ERROR   13:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: ERROR   13:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:22:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:22:20 compute-1 nova_compute[187078]: 2025-11-24 13:22:20.053 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:22 compute-1 nova_compute[187078]: 2025-11-24 13:22:22.406 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:24 compute-1 nova_compute[187078]: 2025-11-24 13:22:24.311 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990529.3093963, df72ea7f-d1a4-4c43-8112-40abfc528851 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:22:24 compute-1 nova_compute[187078]: 2025-11-24 13:22:24.312 187082 INFO nova.compute.manager [-] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] VM Stopped (Lifecycle Event)
Nov 24 13:22:24 compute-1 nova_compute[187078]: 2025-11-24 13:22:24.328 187082 DEBUG nova.compute.manager [None req-ad0ffccb-d0d0-4885-a3b9-b41cc2219344 - - - - - -] [instance: df72ea7f-d1a4-4c43-8112-40abfc528851] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:22:25 compute-1 nova_compute[187078]: 2025-11-24 13:22:25.056 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:26 compute-1 podman[210865]: 2025-11-24 13:22:26.54631636 +0000 UTC m=+0.081805111 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:22:26 compute-1 podman[210866]: 2025-11-24 13:22:26.576635411 +0000 UTC m=+0.107600688 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:22:27 compute-1 nova_compute[187078]: 2025-11-24 13:22:27.408 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:30 compute-1 nova_compute[187078]: 2025-11-24 13:22:30.058 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:32 compute-1 nova_compute[187078]: 2025-11-24 13:22:32.410 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:33 compute-1 podman[210908]: 2025-11-24 13:22:33.502467906 +0000 UTC m=+0.052666973 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:22:33 compute-1 podman[210909]: 2025-11-24 13:22:33.528635864 +0000 UTC m=+0.077980257 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 24 13:22:35 compute-1 nova_compute[187078]: 2025-11-24 13:22:35.060 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:35 compute-1 podman[197429]: time="2025-11-24T13:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:22:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:22:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Nov 24 13:22:37 compute-1 nova_compute[187078]: 2025-11-24 13:22:37.413 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:38 compute-1 nova_compute[187078]: 2025-11-24 13:22:38.974 187082 DEBUG nova.compute.manager [None req-e6f108fe-3d87-4843-b0fd-328aa2ceae4c f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 24 13:22:39 compute-1 nova_compute[187078]: 2025-11-24 13:22:39.033 187082 DEBUG nova.compute.provider_tree [None req-e6f108fe-3d87-4843-b0fd-328aa2ceae4c f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 14 to 17 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:22:40 compute-1 nova_compute[187078]: 2025-11-24 13:22:40.062 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:40 compute-1 podman[210954]: 2025-11-24 13:22:40.508719505 +0000 UTC m=+0.053671280 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 24 13:22:41 compute-1 sshd-session[210952]: Invalid user user2 from 175.100.24.139 port 52248
Nov 24 13:22:41 compute-1 sshd-session[210952]: Received disconnect from 175.100.24.139 port 52248:11: Bye Bye [preauth]
Nov 24 13:22:41 compute-1 sshd-session[210952]: Disconnected from invalid user user2 175.100.24.139 port 52248 [preauth]
Nov 24 13:22:41 compute-1 sshd-session[210975]: Invalid user svn from 68.183.82.237 port 41906
Nov 24 13:22:42 compute-1 sshd-session[210975]: Received disconnect from 68.183.82.237 port 41906:11: Bye Bye [preauth]
Nov 24 13:22:42 compute-1 sshd-session[210975]: Disconnected from invalid user svn 68.183.82.237 port 41906 [preauth]
Nov 24 13:22:42 compute-1 nova_compute[187078]: 2025-11-24 13:22:42.415 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:43 compute-1 nova_compute[187078]: 2025-11-24 13:22:43.423 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:45 compute-1 nova_compute[187078]: 2025-11-24 13:22:45.064 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:47 compute-1 nova_compute[187078]: 2025-11-24 13:22:47.417 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:47 compute-1 sshd-session[210977]: Received disconnect from 85.209.134.43 port 41328:11: Bye Bye [preauth]
Nov 24 13:22:47 compute-1 sshd-session[210977]: Disconnected from authenticating user root 85.209.134.43 port 41328 [preauth]
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: ERROR   13:22:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: ERROR   13:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: ERROR   13:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: ERROR   13:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: ERROR   13:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:22:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:22:50 compute-1 nova_compute[187078]: 2025-11-24 13:22:50.066 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:52 compute-1 nova_compute[187078]: 2025-11-24 13:22:52.418 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:53 compute-1 sshd-session[210475]: Connection closed by 45.78.217.131 port 60882 [preauth]
Nov 24 13:22:55 compute-1 nova_compute[187078]: 2025-11-24 13:22:55.068 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:57 compute-1 nova_compute[187078]: 2025-11-24 13:22:57.420 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:22:57 compute-1 podman[210980]: 2025-11-24 13:22:57.545968047 +0000 UTC m=+0.077219576 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:22:57 compute-1 podman[210981]: 2025-11-24 13:22:57.579896816 +0000 UTC m=+0.110529977 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:23:00 compute-1 nova_compute[187078]: 2025-11-24 13:23:00.071 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:00 compute-1 sshd-session[211021]: Invalid user intell from 5.198.176.28 port 43914
Nov 24 13:23:00 compute-1 sshd-session[211021]: Received disconnect from 5.198.176.28 port 43914:11: Bye Bye [preauth]
Nov 24 13:23:00 compute-1 sshd-session[211021]: Disconnected from invalid user intell 5.198.176.28 port 43914 [preauth]
Nov 24 13:23:02 compute-1 nova_compute[187078]: 2025-11-24 13:23:02.421 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:03 compute-1 sshd-session[211023]: Invalid user sol from 45.148.10.240 port 52944
Nov 24 13:23:03 compute-1 sshd-session[211023]: Connection closed by invalid user sol 45.148.10.240 port 52944 [preauth]
Nov 24 13:23:03 compute-1 nova_compute[187078]: 2025-11-24 13:23:03.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:23:04.151 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:23:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:23:04.152 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:23:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:23:04.152 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:23:04 compute-1 podman[211025]: 2025-11-24 13:23:04.53193709 +0000 UTC m=+0.075821598 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:23:04 compute-1 podman[211026]: 2025-11-24 13:23:04.592074627 +0000 UTC m=+0.121985092 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:23:05 compute-1 nova_compute[187078]: 2025-11-24 13:23:05.073 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:05 compute-1 podman[197429]: time="2025-11-24T13:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:23:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:23:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 24 13:23:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:23:06.561 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:23:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:23:06.562 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:23:06 compute-1 nova_compute[187078]: 2025-11-24 13:23:06.599 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:06 compute-1 nova_compute[187078]: 2025-11-24 13:23:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:07 compute-1 nova_compute[187078]: 2025-11-24 13:23:07.423 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:07 compute-1 nova_compute[187078]: 2025-11-24 13:23:07.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:07 compute-1 nova_compute[187078]: 2025-11-24 13:23:07.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:08 compute-1 nova_compute[187078]: 2025-11-24 13:23:08.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.691 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.692 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.693 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.729 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.729 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.729 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.730 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.883 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.884 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5887MB free_disk=73.46193313598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.884 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.885 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.955 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.956 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.978 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.993 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.995 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:23:09 compute-1 nova_compute[187078]: 2025-11-24 13:23:09.995 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:23:10 compute-1 nova_compute[187078]: 2025-11-24 13:23:10.075 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:11 compute-1 podman[211072]: 2025-11-24 13:23:11.535927747 +0000 UTC m=+0.076815474 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 24 13:23:11 compute-1 nova_compute[187078]: 2025-11-24 13:23:11.969 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:23:11 compute-1 nova_compute[187078]: 2025-11-24 13:23:11.969 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:23:12 compute-1 nova_compute[187078]: 2025-11-24 13:23:12.426 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:15 compute-1 nova_compute[187078]: 2025-11-24 13:23:15.077 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:16 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:23:16.565 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:23:17 compute-1 nova_compute[187078]: 2025-11-24 13:23:17.429 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:18 compute-1 sshd-session[211093]: Invalid user casaos from 176.114.89.34 port 55292
Nov 24 13:23:18 compute-1 sshd-session[211093]: Received disconnect from 176.114.89.34 port 55292:11: Bye Bye [preauth]
Nov 24 13:23:18 compute-1 sshd-session[211093]: Disconnected from invalid user casaos 176.114.89.34 port 55292 [preauth]
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: ERROR   13:23:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: ERROR   13:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: ERROR   13:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: ERROR   13:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: ERROR   13:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:23:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:23:19 compute-1 ovn_controller[95368]: 2025-11-24T13:23:19Z|00094|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 24 13:23:20 compute-1 nova_compute[187078]: 2025-11-24 13:23:20.079 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:22 compute-1 nova_compute[187078]: 2025-11-24 13:23:22.433 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:25 compute-1 nova_compute[187078]: 2025-11-24 13:23:25.082 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:27 compute-1 nova_compute[187078]: 2025-11-24 13:23:27.434 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:28 compute-1 podman[211096]: 2025-11-24 13:23:28.513674903 +0000 UTC m=+0.049194966 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 13:23:28 compute-1 podman[211095]: 2025-11-24 13:23:28.526064048 +0000 UTC m=+0.065570459 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:23:30 compute-1 nova_compute[187078]: 2025-11-24 13:23:30.085 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:32 compute-1 nova_compute[187078]: 2025-11-24 13:23:32.436 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:35 compute-1 nova_compute[187078]: 2025-11-24 13:23:35.087 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:35 compute-1 podman[211138]: 2025-11-24 13:23:35.506656627 +0000 UTC m=+0.051891220 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:23:35 compute-1 podman[211139]: 2025-11-24 13:23:35.569668184 +0000 UTC m=+0.109941353 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 13:23:35 compute-1 podman[197429]: time="2025-11-24T13:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:23:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:23:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Nov 24 13:23:37 compute-1 nova_compute[187078]: 2025-11-24 13:23:37.438 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:40 compute-1 nova_compute[187078]: 2025-11-24 13:23:40.089 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:42 compute-1 nova_compute[187078]: 2025-11-24 13:23:42.441 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:42 compute-1 podman[211186]: 2025-11-24 13:23:42.530072688 +0000 UTC m=+0.077869118 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 24 13:23:45 compute-1 sshd-session[211207]: Invalid user customer from 85.209.134.43 port 33064
Nov 24 13:23:45 compute-1 sshd-session[211207]: Received disconnect from 85.209.134.43 port 33064:11: Bye Bye [preauth]
Nov 24 13:23:45 compute-1 sshd-session[211207]: Disconnected from invalid user customer 85.209.134.43 port 33064 [preauth]
Nov 24 13:23:45 compute-1 nova_compute[187078]: 2025-11-24 13:23:45.092 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:45 compute-1 sshd-session[211184]: Connection closed by 45.78.217.131 port 58790 [preauth]
Nov 24 13:23:47 compute-1 nova_compute[187078]: 2025-11-24 13:23:47.443 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: ERROR   13:23:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: ERROR   13:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: ERROR   13:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: ERROR   13:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: ERROR   13:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:23:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:23:50 compute-1 nova_compute[187078]: 2025-11-24 13:23:50.094 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:52 compute-1 nova_compute[187078]: 2025-11-24 13:23:52.445 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:55 compute-1 nova_compute[187078]: 2025-11-24 13:23:55.095 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:57 compute-1 nova_compute[187078]: 2025-11-24 13:23:57.447 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:23:58 compute-1 sshd-session[211209]: Invalid user sam from 68.183.82.237 port 57328
Nov 24 13:23:58 compute-1 sshd-session[211209]: Received disconnect from 68.183.82.237 port 57328:11: Bye Bye [preauth]
Nov 24 13:23:58 compute-1 sshd-session[211209]: Disconnected from invalid user sam 68.183.82.237 port 57328 [preauth]
Nov 24 13:23:59 compute-1 sshd-session[211211]: Invalid user doge from 193.32.162.145 port 41358
Nov 24 13:23:59 compute-1 podman[211214]: 2025-11-24 13:23:59.398606988 +0000 UTC m=+0.064072937 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 13:23:59 compute-1 podman[211213]: 2025-11-24 13:23:59.40311891 +0000 UTC m=+0.077738726 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:23:59 compute-1 sshd-session[211211]: Connection closed by invalid user doge 193.32.162.145 port 41358 [preauth]
Nov 24 13:24:00 compute-1 nova_compute[187078]: 2025-11-24 13:24:00.098 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:02 compute-1 nova_compute[187078]: 2025-11-24 13:24:02.449 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:03 compute-1 nova_compute[187078]: 2025-11-24 13:24:03.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:04.152 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:04.152 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:04.153 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:05 compute-1 nova_compute[187078]: 2025-11-24 13:24:05.100 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:05 compute-1 podman[197429]: time="2025-11-24T13:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:24:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:24:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Nov 24 13:24:06 compute-1 podman[211254]: 2025-11-24 13:24:06.584473546 +0000 UTC m=+0.119356086 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:24:06 compute-1 podman[211255]: 2025-11-24 13:24:06.591915436 +0000 UTC m=+0.125374738 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:24:07 compute-1 nova_compute[187078]: 2025-11-24 13:24:07.450 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:07 compute-1 nova_compute[187078]: 2025-11-24 13:24:07.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:07 compute-1 nova_compute[187078]: 2025-11-24 13:24:07.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:08 compute-1 sshd-session[211298]: Received disconnect from 5.198.176.28 port 44022:11: Bye Bye [preauth]
Nov 24 13:24:08 compute-1 sshd-session[211298]: Disconnected from authenticating user root 5.198.176.28 port 44022 [preauth]
Nov 24 13:24:08 compute-1 nova_compute[187078]: 2025-11-24 13:24:08.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:08.936 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:24:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:08.937 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:24:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:08.939 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:08 compute-1 nova_compute[187078]: 2025-11-24 13:24:08.941 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:09 compute-1 nova_compute[187078]: 2025-11-24 13:24:09.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:09 compute-1 nova_compute[187078]: 2025-11-24 13:24:09.963 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:09 compute-1 nova_compute[187078]: 2025-11-24 13:24:09.964 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:09 compute-1 nova_compute[187078]: 2025-11-24 13:24:09.964 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:09 compute-1 nova_compute[187078]: 2025-11-24 13:24:09.965 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.102 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.231 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.233 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5892MB free_disk=73.46195983886719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.234 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.234 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.296 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.296 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.318 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.338 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.339 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.356 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.378 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.403 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.416 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.418 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:24:10 compute-1 nova_compute[187078]: 2025-11-24 13:24:10.419 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.420 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.681 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.681 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.682 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:11 compute-1 nova_compute[187078]: 2025-11-24 13:24:11.682 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:24:12 compute-1 nova_compute[187078]: 2025-11-24 13:24:12.452 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:13 compute-1 podman[211301]: 2025-11-24 13:24:13.52930239 +0000 UTC m=+0.069053850 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Nov 24 13:24:13 compute-1 nova_compute[187078]: 2025-11-24 13:24:13.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:24:15 compute-1 nova_compute[187078]: 2025-11-24 13:24:15.103 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:17 compute-1 nova_compute[187078]: 2025-11-24 13:24:17.454 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.219 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.220 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.239 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.316 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.317 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.325 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.325 187082 INFO nova.compute.claims [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.427 187082 DEBUG nova.compute.provider_tree [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.441 187082 DEBUG nova.scheduler.client.report [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.458 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.458 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.500 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.501 187082 DEBUG nova.network.neutron [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.518 187082 INFO nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.532 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.650 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.651 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.652 187082 INFO nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Creating image(s)
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.653 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "/var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.653 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "/var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.654 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "/var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.684 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.747 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.749 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.750 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.765 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.820 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.822 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.855 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.856 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.857 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.911 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.911 187082 DEBUG nova.virt.disk.api [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Checking if we can resize image /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.912 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.974 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.975 187082 DEBUG nova.virt.disk.api [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Cannot resize image /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:24:18 compute-1 nova_compute[187078]: 2025-11-24 13:24:18.975 187082 DEBUG nova.objects.instance [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lazy-loading 'migration_context' on Instance uuid c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.003 187082 DEBUG nova.policy [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d308744a09d4f178c98b8819242cf19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eee9e916301c41549b22b5a9425bdedd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.006 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.006 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Ensure instance console log exists: /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.007 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.007 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.007 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: ERROR   13:24:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: ERROR   13:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: ERROR   13:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: ERROR   13:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: ERROR   13:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:24:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:24:19 compute-1 nova_compute[187078]: 2025-11-24 13:24:19.695 187082 DEBUG nova.network.neutron [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Successfully created port: cc1251ff-b617-4a9f-a77b-a84ac3b48832 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.105 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.595 187082 DEBUG nova.network.neutron [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Successfully updated port: cc1251ff-b617-4a9f-a77b-a84ac3b48832 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.611 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "refresh_cache-c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.611 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquired lock "refresh_cache-c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.612 187082 DEBUG nova.network.neutron [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.685 187082 DEBUG nova.compute.manager [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-changed-cc1251ff-b617-4a9f-a77b-a84ac3b48832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.686 187082 DEBUG nova.compute.manager [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Refreshing instance network info cache due to event network-changed-cc1251ff-b617-4a9f-a77b-a84ac3b48832. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.686 187082 DEBUG oslo_concurrency.lockutils [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:24:20 compute-1 nova_compute[187078]: 2025-11-24 13:24:20.950 187082 DEBUG nova.network.neutron [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.964 187082 DEBUG nova.network.neutron [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Updating instance_info_cache with network_info: [{"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.987 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Releasing lock "refresh_cache-c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.988 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Instance network_info: |[{"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.988 187082 DEBUG oslo_concurrency.lockutils [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.989 187082 DEBUG nova.network.neutron [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Refreshing network info cache for port cc1251ff-b617-4a9f-a77b-a84ac3b48832 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.992 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Start _get_guest_xml network_info=[{"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:24:21 compute-1 nova_compute[187078]: 2025-11-24 13:24:21.998 187082 WARNING nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.004 187082 DEBUG nova.virt.libvirt.host [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.005 187082 DEBUG nova.virt.libvirt.host [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.011 187082 DEBUG nova.virt.libvirt.host [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.012 187082 DEBUG nova.virt.libvirt.host [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.013 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.014 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.015 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.015 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.015 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.016 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.016 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.017 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.017 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.017 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.018 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.018 187082 DEBUG nova.virt.hardware [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.023 187082 DEBUG nova.virt.libvirt.vif [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:24:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1227850426',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1227850426',id=10,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eee9e916301c41549b22b5a9425bdedd',ramdisk_id='',reservation_id='r-a9iwxz9r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:24:18Z,user_data=None,user_id='2d308744a09d4f178c98b8819242cf19',uuid=c0bb309a-a2fc-4698-aff6-bfdabaaf0be0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.023 187082 DEBUG nova.network.os_vif_util [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converting VIF {"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.024 187082 DEBUG nova.network.os_vif_util [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.025 187082 DEBUG nova.objects.instance [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lazy-loading 'pci_devices' on Instance uuid c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.051 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <uuid>c0bb309a-a2fc-4698-aff6-bfdabaaf0be0</uuid>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <name>instance-0000000a</name>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1227850426</nova:name>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:24:22</nova:creationTime>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:user uuid="2d308744a09d4f178c98b8819242cf19">tempest-TestExecuteHostMaintenanceStrategy-1998539988-project-member</nova:user>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:project uuid="eee9e916301c41549b22b5a9425bdedd">tempest-TestExecuteHostMaintenanceStrategy-1998539988</nova:project>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         <nova:port uuid="cc1251ff-b617-4a9f-a77b-a84ac3b48832">
Nov 24 13:24:22 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <system>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <entry name="serial">c0bb309a-a2fc-4698-aff6-bfdabaaf0be0</entry>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <entry name="uuid">c0bb309a-a2fc-4698-aff6-bfdabaaf0be0</entry>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </system>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <os>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </os>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <features>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </features>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.config"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:d9:9e:c7"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <target dev="tapcc1251ff-b6"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/console.log" append="off"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <video>
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </video>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:24:22 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:24:22 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:24:22 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:24:22 compute-1 nova_compute[187078]: </domain>
Nov 24 13:24:22 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.052 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Preparing to wait for external event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.053 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.053 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.054 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.054 187082 DEBUG nova.virt.libvirt.vif [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:24:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1227850426',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1227850426',id=10,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eee9e916301c41549b22b5a9425bdedd',ramdisk_id='',reservation_id='r-a9iwxz9r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:24:18Z,user_data=None,user_id='2d308744a09d4f178c98b8819242cf19',uuid=c0bb309a-a2fc-4698-aff6-bfdabaaf0be0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.055 187082 DEBUG nova.network.os_vif_util [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converting VIF {"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.056 187082 DEBUG nova.network.os_vif_util [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.056 187082 DEBUG os_vif [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.057 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.058 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.058 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.065 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.065 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc1251ff-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.066 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc1251ff-b6, col_values=(('external_ids', {'iface-id': 'cc1251ff-b617-4a9f-a77b-a84ac3b48832', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:9e:c7', 'vm-uuid': 'c0bb309a-a2fc-4698-aff6-bfdabaaf0be0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.070 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 NetworkManager[55527]: <info>  [1763990662.0727] manager: (tapcc1251ff-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.074 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.078 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.080 187082 INFO os_vif [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6')
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.130 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.131 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.131 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] No VIF found with MAC fa:16:3e:d9:9e:c7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.132 187082 INFO nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Using config drive
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.392 187082 INFO nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Creating config drive at /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.config
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.402 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_ikm7aw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.458 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.556 187082 DEBUG oslo_concurrency.processutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_ikm7aw" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:22 compute-1 kernel: tapcc1251ff-b6: entered promiscuous mode
Nov 24 13:24:22 compute-1 NetworkManager[55527]: <info>  [1763990662.6462] manager: (tapcc1251ff-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Nov 24 13:24:22 compute-1 ovn_controller[95368]: 2025-11-24T13:24:22Z|00095|binding|INFO|Claiming lport cc1251ff-b617-4a9f-a77b-a84ac3b48832 for this chassis.
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.646 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 ovn_controller[95368]: 2025-11-24T13:24:22Z|00096|binding|INFO|cc1251ff-b617-4a9f-a77b-a84ac3b48832: Claiming fa:16:3e:d9:9e:c7 10.100.0.12
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.651 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.660 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:9e:c7 10.100.0.12'], port_security=['fa:16:3e:d9:9e:c7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c0bb309a-a2fc-4698-aff6-bfdabaaf0be0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ad71987-bbb3-4172-91ba-9872dff838b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eee9e916301c41549b22b5a9425bdedd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5a3fd9b-1501-464e-8687-13e1f71d8c03', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79adaac3-cb6f-4412-8999-d9184f3887dc, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=cc1251ff-b617-4a9f-a77b-a84ac3b48832) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.661 104225 INFO neutron.agent.ovn.metadata.agent [-] Port cc1251ff-b617-4a9f-a77b-a84ac3b48832 in datapath 6ad71987-bbb3-4172-91ba-9872dff838b6 bound to our chassis
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.663 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ad71987-bbb3-4172-91ba-9872dff838b6
Nov 24 13:24:22 compute-1 systemd-udevd[211356]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.680 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f308f802-f4c5-4882-8935-7fc280e511f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.682 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6ad71987-b1 in ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.684 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6ad71987-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.684 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e5709532-4bc0-457a-9d0f-3acfbf36f830]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.687 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[54dcdb64-57e8-450b-badf-f27672bc5c1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 NetworkManager[55527]: <info>  [1763990662.6978] device (tapcc1251ff-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:24:22 compute-1 NetworkManager[55527]: <info>  [1763990662.6988] device (tapcc1251ff-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.702 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[36c75273-dcd8-4382-8392-f1c7a24eaf0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.704 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 systemd-machined[153355]: New machine qemu-7-instance-0000000a.
Nov 24 13:24:22 compute-1 ovn_controller[95368]: 2025-11-24T13:24:22Z|00097|binding|INFO|Setting lport cc1251ff-b617-4a9f-a77b-a84ac3b48832 ovn-installed in OVS
Nov 24 13:24:22 compute-1 ovn_controller[95368]: 2025-11-24T13:24:22Z|00098|binding|INFO|Setting lport cc1251ff-b617-4a9f-a77b-a84ac3b48832 up in Southbound
Nov 24 13:24:22 compute-1 nova_compute[187078]: 2025-11-24 13:24:22.712 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:22 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.729 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5523c49b-f941-4574-bb6b-9305d7229fa6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.767 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[94a61b0e-61f2-4410-899c-3a67823c76dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.774 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebfe99c-7efa-4750-b335-ce86438cf647]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 NetworkManager[55527]: <info>  [1763990662.7756] manager: (tap6ad71987-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.809 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[64300952-00e6-436b-9013-cc898137bea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.815 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b0849a-8cb4-4651-b71e-661a9029cc6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 NetworkManager[55527]: <info>  [1763990662.8428] device (tap6ad71987-b0): carrier: link connected
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.849 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[8858c3df-5be7-4aaf-bb5d-cfd6d9e09554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.872 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b0be549e-edbd-483d-b7dd-ba35b2007dd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ad71987-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:c3:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368150, 'reachable_time': 22350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211392, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.897 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[00149ec5-106b-4ae1-a38b-1f4d40505320]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:c3cd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368150, 'tstamp': 368150}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211393, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.921 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3025ba5f-151c-454c-97a2-9d3077f5965d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ad71987-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:c3:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368150, 'reachable_time': 22350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211394, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:22 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:22.960 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[db00c353-86c8-4f24-90fe-66aef8ef3636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.029 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[23314b5d-acd4-4623-a2af-ced1c2ae1093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.032 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ad71987-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.032 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.033 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ad71987-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:23 compute-1 NetworkManager[55527]: <info>  [1763990663.0697] manager: (tap6ad71987-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 24 13:24:23 compute-1 kernel: tap6ad71987-b0: entered promiscuous mode
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.070 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.075 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ad71987-b0, col_values=(('external_ids', {'iface-id': '1c6ca86f-d804-429c-9f44-ccaef792cf62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:23 compute-1 ovn_controller[95368]: 2025-11-24T13:24:23Z|00099|binding|INFO|Releasing lport 1c6ca86f-d804-429c-9f44-ccaef792cf62 from this chassis (sb_readonly=0)
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.077 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.100 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990663.099547, c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.101 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] VM Started (Lifecycle Event)
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.105 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6ad71987-bbb3-4172-91ba-9872dff838b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6ad71987-bbb3-4172-91ba-9872dff838b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.106 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[757a2f90-96f1-40b3-95fc-24f03985459a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.107 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-6ad71987-bbb3-4172-91ba-9872dff838b6
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/6ad71987-bbb3-4172-91ba-9872dff838b6.pid.haproxy
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID 6ad71987-bbb3-4172-91ba-9872dff838b6
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.108 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:23 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:24:23.109 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'env', 'PROCESS_TAG=haproxy-6ad71987-bbb3-4172-91ba-9872dff838b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6ad71987-bbb3-4172-91ba-9872dff838b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.112 187082 DEBUG nova.compute.manager [req-49724fa1-c2e4-4cb2-bc23-afafbb3e725e req-935f37de-bd0e-49fb-ac92-4c9cfcbe7690 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.113 187082 DEBUG oslo_concurrency.lockutils [req-49724fa1-c2e4-4cb2-bc23-afafbb3e725e req-935f37de-bd0e-49fb-ac92-4c9cfcbe7690 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.114 187082 DEBUG oslo_concurrency.lockutils [req-49724fa1-c2e4-4cb2-bc23-afafbb3e725e req-935f37de-bd0e-49fb-ac92-4c9cfcbe7690 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.114 187082 DEBUG oslo_concurrency.lockutils [req-49724fa1-c2e4-4cb2-bc23-afafbb3e725e req-935f37de-bd0e-49fb-ac92-4c9cfcbe7690 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.115 187082 DEBUG nova.compute.manager [req-49724fa1-c2e4-4cb2-bc23-afafbb3e725e req-935f37de-bd0e-49fb-ac92-4c9cfcbe7690 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Processing event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.116 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.122 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.123 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.129 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.135 187082 INFO nova.virt.libvirt.driver [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Instance spawned successfully.
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.135 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.152 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.153 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990663.1000528, c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.153 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] VM Paused (Lifecycle Event)
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.167 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.168 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.169 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.169 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.170 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.170 187082 DEBUG nova.virt.libvirt.driver [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.174 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.181 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990663.1233025, c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.182 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] VM Resumed (Lifecycle Event)
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.200 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.204 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.224 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.231 187082 INFO nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Took 4.58 seconds to spawn the instance on the hypervisor.
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.231 187082 DEBUG nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.313 187082 INFO nova.compute.manager [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Took 5.02 seconds to build instance.
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.326 187082 DEBUG nova.network.neutron [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Updated VIF entry in instance network info cache for port cc1251ff-b617-4a9f-a77b-a84ac3b48832. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.326 187082 DEBUG nova.network.neutron [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Updating instance_info_cache with network_info: [{"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.341 187082 DEBUG oslo_concurrency.lockutils [None req-ffb98d24-3f83-42c6-9930-93808e1a0149 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:23 compute-1 nova_compute[187078]: 2025-11-24 13:24:23.344 187082 DEBUG oslo_concurrency.lockutils [req-c2398415-9717-44e7-b779-720e4d8b1c55 req-a709e487-22c2-46c2-8a98-11e410048915 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:24:23 compute-1 podman[211433]: 2025-11-24 13:24:23.515425149 +0000 UTC m=+0.048340533 container create 7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:24:23 compute-1 systemd[1]: Started libpod-conmon-7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682.scope.
Nov 24 13:24:23 compute-1 podman[211433]: 2025-11-24 13:24:23.492549683 +0000 UTC m=+0.025465087 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:24:23 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:24:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db1b27243cbb50d2001406d59b6b3af9d513ada61a0a653ef69d6c83a93f6b58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:24:23 compute-1 podman[211433]: 2025-11-24 13:24:23.631562108 +0000 UTC m=+0.164477522 container init 7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:24:23 compute-1 podman[211433]: 2025-11-24 13:24:23.642433861 +0000 UTC m=+0.175349245 container start 7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 13:24:23 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [NOTICE]   (211452) : New worker (211454) forked
Nov 24 13:24:23 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [NOTICE]   (211452) : Loading success.
Nov 24 13:24:25 compute-1 nova_compute[187078]: 2025-11-24 13:24:25.238 187082 DEBUG nova.compute.manager [req-a4850feb-1e1d-451c-9905-d32ee1f685e0 req-60baad1c-b067-4bd3-8513-7f736d2d3c1d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:24:25 compute-1 nova_compute[187078]: 2025-11-24 13:24:25.238 187082 DEBUG oslo_concurrency.lockutils [req-a4850feb-1e1d-451c-9905-d32ee1f685e0 req-60baad1c-b067-4bd3-8513-7f736d2d3c1d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:25 compute-1 nova_compute[187078]: 2025-11-24 13:24:25.239 187082 DEBUG oslo_concurrency.lockutils [req-a4850feb-1e1d-451c-9905-d32ee1f685e0 req-60baad1c-b067-4bd3-8513-7f736d2d3c1d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:25 compute-1 nova_compute[187078]: 2025-11-24 13:24:25.239 187082 DEBUG oslo_concurrency.lockutils [req-a4850feb-1e1d-451c-9905-d32ee1f685e0 req-60baad1c-b067-4bd3-8513-7f736d2d3c1d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:25 compute-1 nova_compute[187078]: 2025-11-24 13:24:25.239 187082 DEBUG nova.compute.manager [req-a4850feb-1e1d-451c-9905-d32ee1f685e0 req-60baad1c-b067-4bd3-8513-7f736d2d3c1d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] No waiting events found dispatching network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:24:25 compute-1 nova_compute[187078]: 2025-11-24 13:24:25.239 187082 WARNING nova.compute.manager [req-a4850feb-1e1d-451c-9905-d32ee1f685e0 req-60baad1c-b067-4bd3-8513-7f736d2d3c1d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received unexpected event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 for instance with vm_state active and task_state None.
Nov 24 13:24:26 compute-1 sshd-session[211463]: Received disconnect from 176.114.89.34 port 34912:11: Bye Bye [preauth]
Nov 24 13:24:26 compute-1 sshd-session[211463]: Disconnected from authenticating user root 176.114.89.34 port 34912 [preauth]
Nov 24 13:24:27 compute-1 nova_compute[187078]: 2025-11-24 13:24:27.074 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:27 compute-1 nova_compute[187078]: 2025-11-24 13:24:27.462 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:29 compute-1 podman[211466]: 2025-11-24 13:24:29.529518405 +0000 UTC m=+0.071761704 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:24:29 compute-1 podman[211465]: 2025-11-24 13:24:29.567322494 +0000 UTC m=+0.100893370 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:24:32 compute-1 sshd-session[211508]: Received disconnect from 175.100.24.139 port 54330:11: Bye Bye [preauth]
Nov 24 13:24:32 compute-1 sshd-session[211508]: Disconnected from authenticating user root 175.100.24.139 port 54330 [preauth]
Nov 24 13:24:32 compute-1 nova_compute[187078]: 2025-11-24 13:24:32.140 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:32 compute-1 nova_compute[187078]: 2025-11-24 13:24:32.467 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:35 compute-1 podman[197429]: time="2025-11-24T13:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:24:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:24:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Nov 24 13:24:37 compute-1 nova_compute[187078]: 2025-11-24 13:24:37.186 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:37 compute-1 ovn_controller[95368]: 2025-11-24T13:24:37Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:9e:c7 10.100.0.12
Nov 24 13:24:37 compute-1 ovn_controller[95368]: 2025-11-24T13:24:37Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:9e:c7 10.100.0.12
Nov 24 13:24:37 compute-1 podman[211529]: 2025-11-24 13:24:37.296674759 +0000 UTC m=+0.076605535 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:24:37 compute-1 podman[211530]: 2025-11-24 13:24:37.363168719 +0000 UTC m=+0.134770811 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:24:37 compute-1 nova_compute[187078]: 2025-11-24 13:24:37.477 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:42 compute-1 nova_compute[187078]: 2025-11-24 13:24:42.190 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:42 compute-1 nova_compute[187078]: 2025-11-24 13:24:42.480 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:44 compute-1 podman[211577]: 2025-11-24 13:24:44.535553283 +0000 UTC m=+0.081037613 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7)
Nov 24 13:24:44 compute-1 sshd-session[211575]: Received disconnect from 85.209.134.43 port 43804:11: Bye Bye [preauth]
Nov 24 13:24:44 compute-1 sshd-session[211575]: Disconnected from authenticating user root 85.209.134.43 port 43804 [preauth]
Nov 24 13:24:47 compute-1 nova_compute[187078]: 2025-11-24 13:24:47.193 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:47 compute-1 nova_compute[187078]: 2025-11-24 13:24:47.482 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: ERROR   13:24:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: ERROR   13:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: ERROR   13:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: ERROR   13:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: ERROR   13:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:24:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:24:49 compute-1 nova_compute[187078]: 2025-11-24 13:24:49.990 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Creating tmpfile /var/lib/nova/instances/tmpxlv6v8ir to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 24 13:24:50 compute-1 nova_compute[187078]: 2025-11-24 13:24:50.096 187082 DEBUG nova.compute.manager [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxlv6v8ir',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 24 13:24:51 compute-1 nova_compute[187078]: 2025-11-24 13:24:51.594 187082 DEBUG nova.compute.manager [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxlv6v8ir',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b4898324-b2cb-43d3-bf2d-f629af08e51c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 24 13:24:51 compute-1 nova_compute[187078]: 2025-11-24 13:24:51.622 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:24:51 compute-1 nova_compute[187078]: 2025-11-24 13:24:51.623 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:24:51 compute-1 nova_compute[187078]: 2025-11-24 13:24:51.623 187082 DEBUG nova.network.neutron [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.240 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.762 187082 DEBUG nova.network.neutron [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Updating instance_info_cache with network_info: [{"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.780 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.781 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxlv6v8ir',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b4898324-b2cb-43d3-bf2d-f629af08e51c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.782 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Creating instance directory: /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.782 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Creating disk.info with the contents: {'/var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk': 'qcow2', '/var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.782 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.783 187082 DEBUG nova.objects.instance [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b4898324-b2cb-43d3-bf2d-f629af08e51c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.807 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.881 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.883 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.883 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.894 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.960 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:52 compute-1 nova_compute[187078]: 2025-11-24 13:24:52.961 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.001 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.002 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.003 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.063 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.064 187082 DEBUG nova.virt.disk.api [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Checking if we can resize image /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.064 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:53 compute-1 ovn_controller[95368]: 2025-11-24T13:24:53Z|00100|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.128 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.131 187082 DEBUG nova.virt.disk.api [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Cannot resize image /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.131 187082 DEBUG nova.objects.instance [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid b4898324-b2cb-43d3-bf2d-f629af08e51c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.151 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.176 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.178 187082 DEBUG nova.virt.libvirt.volume.remotefs [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk.config to /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.178 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk.config /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.761 187082 DEBUG oslo_concurrency.processutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk.config /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c" returned: 0 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.763 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.766 187082 DEBUG nova.virt.libvirt.vif [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:23:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1855890869',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1855890869',id=9,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:24:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eee9e916301c41549b22b5a9425bdedd',ramdisk_id='',reservation_id='r-nc29rqra',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:24:08Z,user_data=None,user_id='2d308744a09d4f178c98b8819242cf19',uuid=b4898324-b2cb-43d3-bf2d-f629af08e51c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.767 187082 DEBUG nova.network.os_vif_util [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.769 187082 DEBUG nova.network.os_vif_util [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:64:5c,bridge_name='br-int',has_traffic_filtering=True,id=50972446-3d7b-4bef-80f9-feac76fbba22,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50972446-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.770 187082 DEBUG os_vif [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:64:5c,bridge_name='br-int',has_traffic_filtering=True,id=50972446-3d7b-4bef-80f9-feac76fbba22,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50972446-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.771 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.772 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.773 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.777 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.778 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50972446-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.779 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50972446-3d, col_values=(('external_ids', {'iface-id': '50972446-3d7b-4bef-80f9-feac76fbba22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:64:5c', 'vm-uuid': 'b4898324-b2cb-43d3-bf2d-f629af08e51c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.781 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:53 compute-1 NetworkManager[55527]: <info>  [1763990693.7827] manager: (tap50972446-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.785 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.789 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.790 187082 INFO os_vif [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:64:5c,bridge_name='br-int',has_traffic_filtering=True,id=50972446-3d7b-4bef-80f9-feac76fbba22,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50972446-3d')
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.791 187082 DEBUG nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 24 13:24:53 compute-1 nova_compute[187078]: 2025-11-24 13:24:53.792 187082 DEBUG nova.compute.manager [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxlv6v8ir',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b4898324-b2cb-43d3-bf2d-f629af08e51c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 24 13:24:54 compute-1 nova_compute[187078]: 2025-11-24 13:24:54.584 187082 DEBUG nova.network.neutron [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Port 50972446-3d7b-4bef-80f9-feac76fbba22 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 24 13:24:54 compute-1 nova_compute[187078]: 2025-11-24 13:24:54.586 187082 DEBUG nova.compute.manager [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxlv6v8ir',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b4898324-b2cb-43d3-bf2d-f629af08e51c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 24 13:24:54 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 24 13:24:54 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 24 13:24:54 compute-1 NetworkManager[55527]: <info>  [1763990694.9227] manager: (tap50972446-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Nov 24 13:24:54 compute-1 kernel: tap50972446-3d: entered promiscuous mode
Nov 24 13:24:54 compute-1 ovn_controller[95368]: 2025-11-24T13:24:54Z|00101|binding|INFO|Claiming lport 50972446-3d7b-4bef-80f9-feac76fbba22 for this additional chassis.
Nov 24 13:24:54 compute-1 nova_compute[187078]: 2025-11-24 13:24:54.926 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:54 compute-1 ovn_controller[95368]: 2025-11-24T13:24:54Z|00102|binding|INFO|50972446-3d7b-4bef-80f9-feac76fbba22: Claiming fa:16:3e:fd:64:5c 10.100.0.14
Nov 24 13:24:54 compute-1 ovn_controller[95368]: 2025-11-24T13:24:54Z|00103|binding|INFO|Setting lport 50972446-3d7b-4bef-80f9-feac76fbba22 ovn-installed in OVS
Nov 24 13:24:54 compute-1 nova_compute[187078]: 2025-11-24 13:24:54.938 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:54 compute-1 nova_compute[187078]: 2025-11-24 13:24:54.941 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:54 compute-1 systemd-udevd[211654]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:24:54 compute-1 systemd-machined[153355]: New machine qemu-8-instance-00000009.
Nov 24 13:24:54 compute-1 NetworkManager[55527]: <info>  [1763990694.9786] device (tap50972446-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:24:54 compute-1 NetworkManager[55527]: <info>  [1763990694.9805] device (tap50972446-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:24:54 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-00000009.
Nov 24 13:24:57 compute-1 nova_compute[187078]: 2025-11-24 13:24:57.000 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990696.9995203, b4898324-b2cb-43d3-bf2d-f629af08e51c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:24:57 compute-1 nova_compute[187078]: 2025-11-24 13:24:57.002 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] VM Started (Lifecycle Event)
Nov 24 13:24:57 compute-1 nova_compute[187078]: 2025-11-24 13:24:57.026 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:24:57 compute-1 nova_compute[187078]: 2025-11-24 13:24:57.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:58 compute-1 nova_compute[187078]: 2025-11-24 13:24:58.781 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:24:59 compute-1 nova_compute[187078]: 2025-11-24 13:24:59.119 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990699.1190078, b4898324-b2cb-43d3-bf2d-f629af08e51c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:24:59 compute-1 nova_compute[187078]: 2025-11-24 13:24:59.120 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] VM Resumed (Lifecycle Event)
Nov 24 13:24:59 compute-1 nova_compute[187078]: 2025-11-24 13:24:59.139 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:24:59 compute-1 nova_compute[187078]: 2025-11-24 13:24:59.144 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:24:59 compute-1 nova_compute[187078]: 2025-11-24 13:24:59.162 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Nov 24 13:24:59 compute-1 ovn_controller[95368]: 2025-11-24T13:24:59Z|00104|binding|INFO|Claiming lport 50972446-3d7b-4bef-80f9-feac76fbba22 for this chassis.
Nov 24 13:24:59 compute-1 ovn_controller[95368]: 2025-11-24T13:24:59Z|00105|binding|INFO|50972446-3d7b-4bef-80f9-feac76fbba22: Claiming fa:16:3e:fd:64:5c 10.100.0.14
Nov 24 13:24:59 compute-1 ovn_controller[95368]: 2025-11-24T13:24:59Z|00106|binding|INFO|Setting lport 50972446-3d7b-4bef-80f9-feac76fbba22 up in Southbound
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.003 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:64:5c 10.100.0.14'], port_security=['fa:16:3e:fd:64:5c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b4898324-b2cb-43d3-bf2d-f629af08e51c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ad71987-bbb3-4172-91ba-9872dff838b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eee9e916301c41549b22b5a9425bdedd', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a5a3fd9b-1501-464e-8687-13e1f71d8c03', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79adaac3-cb6f-4412-8999-d9184f3887dc, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=50972446-3d7b-4bef-80f9-feac76fbba22) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.005 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 50972446-3d7b-4bef-80f9-feac76fbba22 in datapath 6ad71987-bbb3-4172-91ba-9872dff838b6 bound to our chassis
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.007 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ad71987-bbb3-4172-91ba-9872dff838b6
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.034 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d01450f3-5ab6-4542-94e8-ddfaccb35622]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.073 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5b395601-cefc-40c0-a0e0-822e4d60ddb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.078 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[a16deb30-cc78-48d1-97ee-aa5730c3dc96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.121 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[3d781a69-6867-49fd-b301-44fe36733d88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.144 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9a32b64b-531d-4202-8d60-b8f0314acab5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ad71987-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:c3:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368150, 'reachable_time': 22350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211691, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:00 compute-1 nova_compute[187078]: 2025-11-24 13:25:00.165 187082 INFO nova.compute.manager [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Post operation of migration started
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.166 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec6c7f9-cb6a-4172-a49f-4da4c8df0e1f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6ad71987-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368164, 'tstamp': 368164}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211692, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6ad71987-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368168, 'tstamp': 368168}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211692, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.169 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ad71987-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.172 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ad71987-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.172 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.172 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ad71987-b0, col_values=(('external_ids', {'iface-id': '1c6ca86f-d804-429c-9f44-ccaef792cf62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:00 compute-1 nova_compute[187078]: 2025-11-24 13:25:00.172 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:00.173 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:25:00 compute-1 nova_compute[187078]: 2025-11-24 13:25:00.439 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:25:00 compute-1 nova_compute[187078]: 2025-11-24 13:25:00.440 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:25:00 compute-1 nova_compute[187078]: 2025-11-24 13:25:00.441 187082 DEBUG nova.network.neutron [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:25:00 compute-1 podman[211693]: 2025-11-24 13:25:00.546148988 +0000 UTC m=+0.075121375 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:25:00 compute-1 podman[211694]: 2025-11-24 13:25:00.556084135 +0000 UTC m=+0.085371310 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:25:02 compute-1 nova_compute[187078]: 2025-11-24 13:25:02.490 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:03 compute-1 nova_compute[187078]: 2025-11-24 13:25:03.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:03 compute-1 nova_compute[187078]: 2025-11-24 13:25:03.783 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:03 compute-1 nova_compute[187078]: 2025-11-24 13:25:03.974 187082 DEBUG nova.network.neutron [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Updating instance_info_cache with network_info: [{"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:25:03 compute-1 nova_compute[187078]: 2025-11-24 13:25:03.997 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:25:04 compute-1 nova_compute[187078]: 2025-11-24 13:25:04.014 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:04 compute-1 nova_compute[187078]: 2025-11-24 13:25:04.014 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:04 compute-1 nova_compute[187078]: 2025-11-24 13:25:04.015 187082 DEBUG oslo_concurrency.lockutils [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:04 compute-1 nova_compute[187078]: 2025-11-24 13:25:04.020 187082 INFO nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 24 13:25:04 compute-1 virtqemud[186628]: Domain id=8 name='instance-00000009' uuid=b4898324-b2cb-43d3-bf2d-f629af08e51c is tainted: custom-monitor
Nov 24 13:25:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:04.153 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:04.153 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:04.154 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:05 compute-1 nova_compute[187078]: 2025-11-24 13:25:05.031 187082 INFO nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 24 13:25:05 compute-1 podman[197429]: time="2025-11-24T13:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:25:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:25:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Nov 24 13:25:05 compute-1 nova_compute[187078]: 2025-11-24 13:25:05.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:06 compute-1 nova_compute[187078]: 2025-11-24 13:25:06.043 187082 INFO nova.virt.libvirt.driver [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 24 13:25:06 compute-1 nova_compute[187078]: 2025-11-24 13:25:06.049 187082 DEBUG nova.compute.manager [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:25:06 compute-1 nova_compute[187078]: 2025-11-24 13:25:06.066 187082 DEBUG nova.objects.instance [None req-9b4afa87-74a2-4e60-a610-80033f9f9a96 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 24 13:25:07 compute-1 nova_compute[187078]: 2025-11-24 13:25:07.497 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:07 compute-1 podman[211734]: 2025-11-24 13:25:07.555236024 +0000 UTC m=+0.085763822 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:25:07 compute-1 podman[211735]: 2025-11-24 13:25:07.570909146 +0000 UTC m=+0.104725782 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:25:07 compute-1 nova_compute[187078]: 2025-11-24 13:25:07.691 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:08 compute-1 nova_compute[187078]: 2025-11-24 13:25:08.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:08 compute-1 nova_compute[187078]: 2025-11-24 13:25:08.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:08 compute-1 nova_compute[187078]: 2025-11-24 13:25:08.785 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:10 compute-1 sshd-session[211779]: Invalid user sol from 45.148.10.240 port 36318
Nov 24 13:25:10 compute-1 sshd-session[211779]: Connection closed by invalid user sol 45.148.10.240 port 36318 [preauth]
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.691 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.762 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.855 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.857 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.958 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:25:10 compute-1 nova_compute[187078]: 2025-11-24 13:25:10.966 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.064 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.065 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.134 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.339 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.340 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5556MB free_disk=73.40439224243164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.340 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.341 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.494 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.495 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance b4898324-b2cb-43d3-bf2d-f629af08e51c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.495 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.496 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.665 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.678 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.712 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.713 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.855 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.856 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.856 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.856 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.857 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.858 187082 INFO nova.compute.manager [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Terminating instance
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.859 187082 DEBUG nova.compute.manager [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:25:11 compute-1 kernel: tapcc1251ff-b6 (unregistering): left promiscuous mode
Nov 24 13:25:11 compute-1 NetworkManager[55527]: <info>  [1763990711.9222] device (tapcc1251ff-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:25:11 compute-1 ovn_controller[95368]: 2025-11-24T13:25:11Z|00107|binding|INFO|Releasing lport cc1251ff-b617-4a9f-a77b-a84ac3b48832 from this chassis (sb_readonly=0)
Nov 24 13:25:11 compute-1 ovn_controller[95368]: 2025-11-24T13:25:11Z|00108|binding|INFO|Setting lport cc1251ff-b617-4a9f-a77b-a84ac3b48832 down in Southbound
Nov 24 13:25:11 compute-1 ovn_controller[95368]: 2025-11-24T13:25:11Z|00109|binding|INFO|Removing iface tapcc1251ff-b6 ovn-installed in OVS
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.935 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:11 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:11.946 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:9e:c7 10.100.0.12'], port_security=['fa:16:3e:d9:9e:c7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c0bb309a-a2fc-4698-aff6-bfdabaaf0be0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ad71987-bbb3-4172-91ba-9872dff838b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eee9e916301c41549b22b5a9425bdedd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a5a3fd9b-1501-464e-8687-13e1f71d8c03', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79adaac3-cb6f-4412-8999-d9184f3887dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=cc1251ff-b617-4a9f-a77b-a84ac3b48832) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:25:11 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:11.948 104225 INFO neutron.agent.ovn.metadata.agent [-] Port cc1251ff-b617-4a9f-a77b-a84ac3b48832 in datapath 6ad71987-bbb3-4172-91ba-9872dff838b6 unbound from our chassis
Nov 24 13:25:11 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:11.951 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ad71987-bbb3-4172-91ba-9872dff838b6
Nov 24 13:25:11 compute-1 nova_compute[187078]: 2025-11-24 13:25:11.961 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:11 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:11.979 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[7c023d78-1535-4d84-8566-84ef2f499a54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:11 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 24 13:25:11 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 15.516s CPU time.
Nov 24 13:25:11 compute-1 systemd-machined[153355]: Machine qemu-7-instance-0000000a terminated.
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.024 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb6692c-6b1b-4f03-bb55-ab71717a80b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.029 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[acdc71a4-56c6-4ac5-9fa8-1b341e64ac8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.071 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[273ebc3d-831a-4c39-a23c-4ede1dc68010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.088 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.096 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.102 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e33e31f3-65d7-478d-8bfb-79a660d292b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ad71987-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:c3:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368150, 'reachable_time': 22350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211809, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.128 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3d1718b8-0f92-4c92-a17c-bc395f15408e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6ad71987-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368164, 'tstamp': 368164}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211820, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6ad71987-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368168, 'tstamp': 368168}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211820, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.131 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ad71987-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.132 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.137 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.138 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ad71987-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.138 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.138 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ad71987-b0, col_values=(('external_ids', {'iface-id': '1c6ca86f-d804-429c-9f44-ccaef792cf62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.138 187082 INFO nova.virt.libvirt.driver [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Instance destroyed successfully.
Nov 24 13:25:12 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:12.138 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.138 187082 DEBUG nova.objects.instance [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lazy-loading 'resources' on Instance uuid c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.151 187082 DEBUG nova.virt.libvirt.vif [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:24:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1227850426',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1227850426',id=10,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:24:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eee9e916301c41549b22b5a9425bdedd',ramdisk_id='',reservation_id='r-a9iwxz9r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:24:23Z,user_data=None,user_id='2d308744a09d4f178c98b8819242cf19',uuid=c0bb309a-a2fc-4698-aff6-bfdabaaf0be0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.152 187082 DEBUG nova.network.os_vif_util [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converting VIF {"id": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "address": "fa:16:3e:d9:9e:c7", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc1251ff-b6", "ovs_interfaceid": "cc1251ff-b617-4a9f-a77b-a84ac3b48832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.152 187082 DEBUG nova.network.os_vif_util [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.153 187082 DEBUG os_vif [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.154 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.155 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc1251ff-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.156 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.158 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.163 187082 INFO os_vif [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:9e:c7,bridge_name='br-int',has_traffic_filtering=True,id=cc1251ff-b617-4a9f-a77b-a84ac3b48832,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc1251ff-b6')
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.164 187082 INFO nova.virt.libvirt.driver [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Deleting instance files /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0_del
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.164 187082 INFO nova.virt.libvirt.driver [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Deletion of /var/lib/nova/instances/c0bb309a-a2fc-4698-aff6-bfdabaaf0be0_del complete
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.344 187082 INFO nova.compute.manager [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Took 0.48 seconds to destroy the instance on the hypervisor.
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.345 187082 DEBUG oslo.service.loopingcall [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.345 187082 DEBUG nova.compute.manager [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.345 187082 DEBUG nova.network.neutron [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.500 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.713 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.714 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.714 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.732 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.972 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.972 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.973 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:25:12 compute-1 nova_compute[187078]: 2025-11-24 13:25:12.973 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b4898324-b2cb-43d3-bf2d-f629af08e51c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.166 187082 DEBUG nova.compute.manager [req-ce24da93-2e71-4b7e-a1c9-b35a1c064c88 req-04b55368-4afd-4776-bf3d-b250a0830c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-vif-unplugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.167 187082 DEBUG oslo_concurrency.lockutils [req-ce24da93-2e71-4b7e-a1c9-b35a1c064c88 req-04b55368-4afd-4776-bf3d-b250a0830c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.167 187082 DEBUG oslo_concurrency.lockutils [req-ce24da93-2e71-4b7e-a1c9-b35a1c064c88 req-04b55368-4afd-4776-bf3d-b250a0830c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.168 187082 DEBUG oslo_concurrency.lockutils [req-ce24da93-2e71-4b7e-a1c9-b35a1c064c88 req-04b55368-4afd-4776-bf3d-b250a0830c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.168 187082 DEBUG nova.compute.manager [req-ce24da93-2e71-4b7e-a1c9-b35a1c064c88 req-04b55368-4afd-4776-bf3d-b250a0830c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] No waiting events found dispatching network-vif-unplugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.169 187082 DEBUG nova.compute.manager [req-ce24da93-2e71-4b7e-a1c9-b35a1c064c88 req-04b55368-4afd-4776-bf3d-b250a0830c66 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-vif-unplugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.367 187082 DEBUG nova.network.neutron [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.385 187082 INFO nova.compute.manager [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Took 3.04 seconds to deallocate network for instance.
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.434 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.435 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.512 187082 DEBUG nova.compute.provider_tree [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.528 187082 DEBUG nova.scheduler.client.report [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.547 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:15 compute-1 sshd-session[211826]: Invalid user janice from 68.183.82.237 port 33496
Nov 24 13:25:15 compute-1 podman[211829]: 2025-11-24 13:25:15.557777778 +0000 UTC m=+0.087345275 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.575 187082 INFO nova.scheduler.client.report [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Deleted allocations for instance c0bb309a-a2fc-4698-aff6-bfdabaaf0be0
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.658 187082 DEBUG oslo_concurrency.lockutils [None req-3698ddd5-b937-47dd-a114-8ccc4e6c9ac6 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.711 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Updating instance_info_cache with network_info: [{"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.724 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-b4898324-b2cb-43d3-bf2d-f629af08e51c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.725 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.726 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.726 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.727 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.727 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.729 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:15 compute-1 nova_compute[187078]: 2025-11-24 13:25:15.729 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:25:15 compute-1 sshd-session[211826]: Received disconnect from 68.183.82.237 port 33496:11: Bye Bye [preauth]
Nov 24 13:25:15 compute-1 sshd-session[211826]: Disconnected from invalid user janice 68.183.82.237 port 33496 [preauth]
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.304 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "b4898324-b2cb-43d3-bf2d-f629af08e51c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.305 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.305 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.305 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.306 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.307 187082 INFO nova.compute.manager [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Terminating instance
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.308 187082 DEBUG nova.compute.manager [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:25:16 compute-1 kernel: tap50972446-3d (unregistering): left promiscuous mode
Nov 24 13:25:16 compute-1 NetworkManager[55527]: <info>  [1763990716.3845] device (tap50972446-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:25:16 compute-1 ovn_controller[95368]: 2025-11-24T13:25:16Z|00110|binding|INFO|Releasing lport 50972446-3d7b-4bef-80f9-feac76fbba22 from this chassis (sb_readonly=0)
Nov 24 13:25:16 compute-1 ovn_controller[95368]: 2025-11-24T13:25:16Z|00111|binding|INFO|Setting lport 50972446-3d7b-4bef-80f9-feac76fbba22 down in Southbound
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.393 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:16 compute-1 ovn_controller[95368]: 2025-11-24T13:25:16Z|00112|binding|INFO|Removing iface tap50972446-3d ovn-installed in OVS
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.399 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:16 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:16.406 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:64:5c 10.100.0.14'], port_security=['fa:16:3e:fd:64:5c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b4898324-b2cb-43d3-bf2d-f629af08e51c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ad71987-bbb3-4172-91ba-9872dff838b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eee9e916301c41549b22b5a9425bdedd', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a5a3fd9b-1501-464e-8687-13e1f71d8c03', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79adaac3-cb6f-4412-8999-d9184f3887dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=50972446-3d7b-4bef-80f9-feac76fbba22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:25:16 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:16.407 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 50972446-3d7b-4bef-80f9-feac76fbba22 in datapath 6ad71987-bbb3-4172-91ba-9872dff838b6 unbound from our chassis
Nov 24 13:25:16 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:16.408 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ad71987-bbb3-4172-91ba-9872dff838b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:25:16 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:16.410 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[82e20d5a-5041-4983-9100-5951e1a5f82c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:16 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:16.410 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6 namespace which is not needed anymore
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.412 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:16 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 24 13:25:16 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Consumed 3.516s CPU time.
Nov 24 13:25:16 compute-1 systemd-machined[153355]: Machine qemu-8-instance-00000009 terminated.
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.580 187082 INFO nova.virt.libvirt.driver [-] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Instance destroyed successfully.
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.580 187082 DEBUG nova.objects.instance [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lazy-loading 'resources' on Instance uuid b4898324-b2cb-43d3-bf2d-f629af08e51c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.596 187082 DEBUG nova.virt.libvirt.vif [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-24T13:23:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1855890869',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1855890869',id=9,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:24:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eee9e916301c41549b22b5a9425bdedd',ramdisk_id='',reservation_id='r-nc29rqra',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1998539988-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:25:06Z,user_data=None,user_id='2d308744a09d4f178c98b8819242cf19',uuid=b4898324-b2cb-43d3-bf2d-f629af08e51c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.597 187082 DEBUG nova.network.os_vif_util [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converting VIF {"id": "50972446-3d7b-4bef-80f9-feac76fbba22", "address": "fa:16:3e:fd:64:5c", "network": {"id": "6ad71987-bbb3-4172-91ba-9872dff838b6", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-249428199-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee9e916301c41549b22b5a9425bdedd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50972446-3d", "ovs_interfaceid": "50972446-3d7b-4bef-80f9-feac76fbba22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.598 187082 DEBUG nova.network.os_vif_util [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:64:5c,bridge_name='br-int',has_traffic_filtering=True,id=50972446-3d7b-4bef-80f9-feac76fbba22,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50972446-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.598 187082 DEBUG os_vif [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:64:5c,bridge_name='br-int',has_traffic_filtering=True,id=50972446-3d7b-4bef-80f9-feac76fbba22,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50972446-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.600 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.601 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50972446-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.603 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.606 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:25:16 compute-1 sshd-session[211828]: Invalid user hacluster from 185.156.73.233 port 22812
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.608 187082 INFO os_vif [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:64:5c,bridge_name='br-int',has_traffic_filtering=True,id=50972446-3d7b-4bef-80f9-feac76fbba22,network=Network(6ad71987-bbb3-4172-91ba-9872dff838b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50972446-3d')
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.609 187082 INFO nova.virt.libvirt.driver [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Deleting instance files /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c_del
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.610 187082 INFO nova.virt.libvirt.driver [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Deletion of /var/lib/nova/instances/b4898324-b2cb-43d3-bf2d-f629af08e51c_del complete
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.669 187082 INFO nova.compute.manager [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.670 187082 DEBUG oslo.service.loopingcall [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.671 187082 DEBUG nova.compute.manager [-] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:25:16 compute-1 nova_compute[187078]: 2025-11-24 13:25:16.671 187082 DEBUG nova.network.neutron [-] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:25:16 compute-1 sshd-session[211828]: Connection closed by invalid user hacluster 185.156.73.233 port 22812 [preauth]
Nov 24 13:25:16 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [NOTICE]   (211452) : haproxy version is 2.8.14-c23fe91
Nov 24 13:25:16 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [NOTICE]   (211452) : path to executable is /usr/sbin/haproxy
Nov 24 13:25:16 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [WARNING]  (211452) : Exiting Master process...
Nov 24 13:25:16 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [WARNING]  (211452) : Exiting Master process...
Nov 24 13:25:16 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [ALERT]    (211452) : Current worker (211454) exited with code 143 (Terminated)
Nov 24 13:25:16 compute-1 neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6[211448]: [WARNING]  (211452) : All workers exited. Exiting... (0)
Nov 24 13:25:16 compute-1 systemd[1]: libpod-7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682.scope: Deactivated successfully.
Nov 24 13:25:16 compute-1 podman[211881]: 2025-11-24 13:25:16.798987119 +0000 UTC m=+0.248969327 container died 7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:25:17 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682-userdata-shm.mount: Deactivated successfully.
Nov 24 13:25:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-db1b27243cbb50d2001406d59b6b3af9d513ada61a0a653ef69d6c83a93f6b58-merged.mount: Deactivated successfully.
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.161 187082 DEBUG nova.compute.manager [req-84b312f7-9356-490d-9e6f-495754793de9 req-feb27cc1-996c-46df-8958-59030cc6a673 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Received event network-vif-unplugged-50972446-3d7b-4bef-80f9-feac76fbba22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.162 187082 DEBUG oslo_concurrency.lockutils [req-84b312f7-9356-490d-9e6f-495754793de9 req-feb27cc1-996c-46df-8958-59030cc6a673 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.162 187082 DEBUG oslo_concurrency.lockutils [req-84b312f7-9356-490d-9e6f-495754793de9 req-feb27cc1-996c-46df-8958-59030cc6a673 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.162 187082 DEBUG oslo_concurrency.lockutils [req-84b312f7-9356-490d-9e6f-495754793de9 req-feb27cc1-996c-46df-8958-59030cc6a673 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.162 187082 DEBUG nova.compute.manager [req-84b312f7-9356-490d-9e6f-495754793de9 req-feb27cc1-996c-46df-8958-59030cc6a673 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] No waiting events found dispatching network-vif-unplugged-50972446-3d7b-4bef-80f9-feac76fbba22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.163 187082 DEBUG nova.compute.manager [req-84b312f7-9356-490d-9e6f-495754793de9 req-feb27cc1-996c-46df-8958-59030cc6a673 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Received event network-vif-unplugged-50972446-3d7b-4bef-80f9-feac76fbba22 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.243 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.243 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.305 187082 DEBUG nova.compute.manager [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.306 187082 DEBUG oslo_concurrency.lockutils [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.306 187082 DEBUG oslo_concurrency.lockutils [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.307 187082 DEBUG oslo_concurrency.lockutils [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "c0bb309a-a2fc-4698-aff6-bfdabaaf0be0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.307 187082 DEBUG nova.compute.manager [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] No waiting events found dispatching network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.308 187082 WARNING nova.compute.manager [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received unexpected event network-vif-plugged-cc1251ff-b617-4a9f-a77b-a84ac3b48832 for instance with vm_state deleted and task_state None.
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.308 187082 DEBUG nova.compute.manager [req-72852d4f-4629-49d2-a8a6-15522ae4c27d req-f70c10e6-49f0-4186-8488-8a1ae5db4686 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Received event network-vif-deleted-cc1251ff-b617-4a9f-a77b-a84ac3b48832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:25:17 compute-1 podman[211881]: 2025-11-24 13:25:17.486749053 +0000 UTC m=+0.936731291 container cleanup 7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:25:17 compute-1 systemd[1]: libpod-conmon-7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682.scope: Deactivated successfully.
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.502 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:17 compute-1 podman[211924]: 2025-11-24 13:25:17.751690139 +0000 UTC m=+0.229983525 container remove 7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.761 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[7aeda733-5739-4214-95be-cd845f886d2c]: (4, ('Mon Nov 24 01:25:16 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6 (7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682)\n7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682\nMon Nov 24 01:25:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6 (7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682)\n7922124350bdd247fb4c469350533ec63c4910113b3f6faba9d1e537dd24c682\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.763 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d7dc10-7cb5-4a90-9f11-522e5c34b5c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.764 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ad71987-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.766 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:17 compute-1 kernel: tap6ad71987-b0: left promiscuous mode
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.778 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.779 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.783 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[453e98bd-7a9d-46e4-aded-11b72c3ede67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.804 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5330b475-32df-40ed-b6d4-2764312fda47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.806 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[43dc2bfd-156a-4432-ab4f-e70db8ccfeae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.831 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e58ab61e-596a-4e68-a8fc-1b536af53423]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368142, 'reachable_time': 28454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211940, 'error': None, 'target': 'ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.834 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6ad71987-bbb3-4172-91ba-9872dff838b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.835 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef8ecd7-06d3-4269-b7d3-3a31a5a72c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:25:17 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:17.835 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:25:17 compute-1 systemd[1]: run-netns-ovnmeta\x2d6ad71987\x2dbbb3\x2d4172\x2d91ba\x2d9872dff838b6.mount: Deactivated successfully.
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.939 187082 DEBUG nova.network.neutron [-] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.957 187082 INFO nova.compute.manager [-] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Took 1.29 seconds to deallocate network for instance.
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.991 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:17 compute-1 nova_compute[187078]: 2025-11-24 13:25:17.992 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:18 compute-1 nova_compute[187078]: 2025-11-24 13:25:18.057 187082 DEBUG nova.compute.provider_tree [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:25:18 compute-1 nova_compute[187078]: 2025-11-24 13:25:18.075 187082 DEBUG nova.scheduler.client.report [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:25:18 compute-1 nova_compute[187078]: 2025-11-24 13:25:18.099 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:18 compute-1 nova_compute[187078]: 2025-11-24 13:25:18.130 187082 INFO nova.scheduler.client.report [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Deleted allocations for instance b4898324-b2cb-43d3-bf2d-f629af08e51c
Nov 24 13:25:18 compute-1 nova_compute[187078]: 2025-11-24 13:25:18.195 187082 DEBUG oslo_concurrency.lockutils [None req-5f959035-bf02-494e-8b2a-2e7a77a823d1 2d308744a09d4f178c98b8819242cf19 eee9e916301c41549b22b5a9425bdedd - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.238 187082 DEBUG nova.compute.manager [req-26ee30de-b049-4ec2-ad19-baf258fe4500 req-683ce6c5-cdd9-4c1c-a7e2-3788b12705db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Received event network-vif-plugged-50972446-3d7b-4bef-80f9-feac76fbba22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.239 187082 DEBUG oslo_concurrency.lockutils [req-26ee30de-b049-4ec2-ad19-baf258fe4500 req-683ce6c5-cdd9-4c1c-a7e2-3788b12705db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.240 187082 DEBUG oslo_concurrency.lockutils [req-26ee30de-b049-4ec2-ad19-baf258fe4500 req-683ce6c5-cdd9-4c1c-a7e2-3788b12705db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.240 187082 DEBUG oslo_concurrency.lockutils [req-26ee30de-b049-4ec2-ad19-baf258fe4500 req-683ce6c5-cdd9-4c1c-a7e2-3788b12705db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "b4898324-b2cb-43d3-bf2d-f629af08e51c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.241 187082 DEBUG nova.compute.manager [req-26ee30de-b049-4ec2-ad19-baf258fe4500 req-683ce6c5-cdd9-4c1c-a7e2-3788b12705db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] No waiting events found dispatching network-vif-plugged-50972446-3d7b-4bef-80f9-feac76fbba22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.241 187082 WARNING nova.compute.manager [req-26ee30de-b049-4ec2-ad19-baf258fe4500 req-683ce6c5-cdd9-4c1c-a7e2-3788b12705db 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Received unexpected event network-vif-plugged-50972446-3d7b-4bef-80f9-feac76fbba22 for instance with vm_state deleted and task_state None.
Nov 24 13:25:19 compute-1 nova_compute[187078]: 2025-11-24 13:25:19.387 187082 DEBUG nova.compute.manager [req-a3b43afa-137f-4630-9291-ddcbe359b6c2 req-4f5bd4c5-1cbc-4e61-913c-bcbf46668d6e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Received event network-vif-deleted-50972446-3d7b-4bef-80f9-feac76fbba22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: ERROR   13:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: ERROR   13:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: ERROR   13:25:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: ERROR   13:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: ERROR   13:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:25:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:25:19 compute-1 sshd-session[211941]: Received disconnect from 5.198.176.28 port 44126:11: Bye Bye [preauth]
Nov 24 13:25:19 compute-1 sshd-session[211941]: Disconnected from authenticating user root 5.198.176.28 port 44126 [preauth]
Nov 24 13:25:20 compute-1 nova_compute[187078]: 2025-11-24 13:25:20.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:25:20 compute-1 nova_compute[187078]: 2025-11-24 13:25:20.676 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:25:20 compute-1 nova_compute[187078]: 2025-11-24 13:25:20.689 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:25:21 compute-1 nova_compute[187078]: 2025-11-24 13:25:21.662 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:21 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:25:21.838 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:25:22 compute-1 nova_compute[187078]: 2025-11-24 13:25:22.504 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:26 compute-1 nova_compute[187078]: 2025-11-24 13:25:26.664 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:27 compute-1 nova_compute[187078]: 2025-11-24 13:25:27.135 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990712.1336408, c0bb309a-a2fc-4698-aff6-bfdabaaf0be0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:25:27 compute-1 nova_compute[187078]: 2025-11-24 13:25:27.135 187082 INFO nova.compute.manager [-] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] VM Stopped (Lifecycle Event)
Nov 24 13:25:27 compute-1 nova_compute[187078]: 2025-11-24 13:25:27.153 187082 DEBUG nova.compute.manager [None req-ca8574e9-b0d3-4e20-b6dc-a6287b781c06 - - - - - -] [instance: c0bb309a-a2fc-4698-aff6-bfdabaaf0be0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:25:27 compute-1 nova_compute[187078]: 2025-11-24 13:25:27.507 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:31 compute-1 podman[211943]: 2025-11-24 13:25:31.520294119 +0000 UTC m=+0.061638922 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:25:31 compute-1 podman[211944]: 2025-11-24 13:25:31.551671455 +0000 UTC m=+0.087460347 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:25:31 compute-1 nova_compute[187078]: 2025-11-24 13:25:31.578 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990716.5762362, b4898324-b2cb-43d3-bf2d-f629af08e51c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:25:31 compute-1 nova_compute[187078]: 2025-11-24 13:25:31.579 187082 INFO nova.compute.manager [-] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] VM Stopped (Lifecycle Event)
Nov 24 13:25:31 compute-1 nova_compute[187078]: 2025-11-24 13:25:31.596 187082 DEBUG nova.compute.manager [None req-84c220d7-611c-4b87-834a-5664068a3a26 - - - - - -] [instance: b4898324-b2cb-43d3-bf2d-f629af08e51c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:25:31 compute-1 nova_compute[187078]: 2025-11-24 13:25:31.667 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:32 compute-1 nova_compute[187078]: 2025-11-24 13:25:32.512 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:35 compute-1 podman[197429]: time="2025-11-24T13:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:25:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:25:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Nov 24 13:25:36 compute-1 nova_compute[187078]: 2025-11-24 13:25:36.701 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:37 compute-1 nova_compute[187078]: 2025-11-24 13:25:37.516 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:38 compute-1 podman[211989]: 2025-11-24 13:25:38.521096269 +0000 UTC m=+0.064383537 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:25:38 compute-1 podman[211990]: 2025-11-24 13:25:38.572686636 +0000 UTC m=+0.099774583 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:25:38 compute-1 sshd-session[211987]: Invalid user bot from 176.114.89.34 port 45942
Nov 24 13:25:38 compute-1 sshd-session[211987]: Received disconnect from 176.114.89.34 port 45942:11: Bye Bye [preauth]
Nov 24 13:25:38 compute-1 sshd-session[211987]: Disconnected from invalid user bot 176.114.89.34 port 45942 [preauth]
Nov 24 13:25:39 compute-1 nova_compute[187078]: 2025-11-24 13:25:39.425 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:41 compute-1 nova_compute[187078]: 2025-11-24 13:25:41.704 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:42 compute-1 nova_compute[187078]: 2025-11-24 13:25:42.519 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:45 compute-1 sshd-session[212036]: Received disconnect from 85.209.134.43 port 55068:11: Bye Bye [preauth]
Nov 24 13:25:45 compute-1 sshd-session[212036]: Disconnected from authenticating user root 85.209.134.43 port 55068 [preauth]
Nov 24 13:25:46 compute-1 podman[212038]: 2025-11-24 13:25:46.53188955 +0000 UTC m=+0.073122206 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 24 13:25:46 compute-1 nova_compute[187078]: 2025-11-24 13:25:46.708 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:47 compute-1 nova_compute[187078]: 2025-11-24 13:25:47.522 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: ERROR   13:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: ERROR   13:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: ERROR   13:25:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: ERROR   13:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: ERROR   13:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:25:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:25:51 compute-1 nova_compute[187078]: 2025-11-24 13:25:51.730 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:52 compute-1 sshd-session[212060]: Connection closed by 193.32.162.146 port 44810
Nov 24 13:25:52 compute-1 nova_compute[187078]: 2025-11-24 13:25:52.524 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:56 compute-1 nova_compute[187078]: 2025-11-24 13:25:56.734 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:25:57 compute-1 nova_compute[187078]: 2025-11-24 13:25:57.526 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:01 compute-1 nova_compute[187078]: 2025-11-24 13:26:01.741 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:02 compute-1 nova_compute[187078]: 2025-11-24 13:26:02.529 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:02 compute-1 podman[212062]: 2025-11-24 13:26:02.556659858 +0000 UTC m=+0.084055314 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 13:26:02 compute-1 podman[212061]: 2025-11-24 13:26:02.558830927 +0000 UTC m=+0.099767332 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:26:03 compute-1 nova_compute[187078]: 2025-11-24 13:26:03.681 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:26:04.154 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:26:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:26:04.154 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:26:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:26:04.155 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:26:05 compute-1 podman[197429]: time="2025-11-24T13:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:26:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:26:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Nov 24 13:26:06 compute-1 nova_compute[187078]: 2025-11-24 13:26:06.753 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:07 compute-1 nova_compute[187078]: 2025-11-24 13:26:07.531 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:09 compute-1 podman[212106]: 2025-11-24 13:26:09.522353951 +0000 UTC m=+0.064020828 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 24 13:26:09 compute-1 podman[212107]: 2025-11-24 13:26:09.56153277 +0000 UTC m=+0.094943391 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:26:09 compute-1 nova_compute[187078]: 2025-11-24 13:26:09.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:09 compute-1 nova_compute[187078]: 2025-11-24 13:26:09.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:10 compute-1 nova_compute[187078]: 2025-11-24 13:26:10.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.693 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.756 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.890 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.891 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5877MB free_disk=73.46195983886719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.891 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.891 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.974 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:26:11 compute-1 nova_compute[187078]: 2025-11-24 13:26:11.974 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:26:12 compute-1 nova_compute[187078]: 2025-11-24 13:26:12.005 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:26:12 compute-1 nova_compute[187078]: 2025-11-24 13:26:12.028 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:26:12 compute-1 nova_compute[187078]: 2025-11-24 13:26:12.055 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:26:12 compute-1 nova_compute[187078]: 2025-11-24 13:26:12.056 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:26:12 compute-1 nova_compute[187078]: 2025-11-24 13:26:12.535 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.057 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.058 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.058 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.075 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.076 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:13 compute-1 nova_compute[187078]: 2025-11-24 13:26:13.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:26:14 compute-1 nova_compute[187078]: 2025-11-24 13:26:14.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:26:14 compute-1 ovn_controller[95368]: 2025-11-24T13:26:14Z|00113|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 24 13:26:16 compute-1 nova_compute[187078]: 2025-11-24 13:26:16.760 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:17 compute-1 podman[212157]: 2025-11-24 13:26:17.529605285 +0000 UTC m=+0.073134756 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, 
managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 24 13:26:17 compute-1 nova_compute[187078]: 2025-11-24 13:26:17.537 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:18 compute-1 sshd-session[212155]: Invalid user fan from 175.100.24.139 port 56650
Nov 24 13:26:18 compute-1 sshd-session[212155]: Received disconnect from 175.100.24.139 port 56650:11: Bye Bye [preauth]
Nov 24 13:26:18 compute-1 sshd-session[212155]: Disconnected from invalid user fan 175.100.24.139 port 56650 [preauth]
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: ERROR   13:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: ERROR   13:26:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: ERROR   13:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: ERROR   13:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: ERROR   13:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:26:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:26:21 compute-1 nova_compute[187078]: 2025-11-24 13:26:21.762 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:22 compute-1 nova_compute[187078]: 2025-11-24 13:26:22.541 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:24 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:26:24.422 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:26:24 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:26:24.424 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:26:24 compute-1 nova_compute[187078]: 2025-11-24 13:26:24.423 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:24 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:26:24.424 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:26:26 compute-1 nova_compute[187078]: 2025-11-24 13:26:26.765 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:27 compute-1 nova_compute[187078]: 2025-11-24 13:26:27.546 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:30 compute-1 sshd-session[212178]: Received disconnect from 5.198.176.28 port 44230:11: Bye Bye [preauth]
Nov 24 13:26:30 compute-1 sshd-session[212178]: Disconnected from authenticating user root 5.198.176.28 port 44230 [preauth]
Nov 24 13:26:31 compute-1 nova_compute[187078]: 2025-11-24 13:26:31.768 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:32 compute-1 sshd-session[212180]: Invalid user train1 from 68.183.82.237 port 54932
Nov 24 13:26:32 compute-1 nova_compute[187078]: 2025-11-24 13:26:32.549 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:32 compute-1 sshd-session[212180]: Received disconnect from 68.183.82.237 port 54932:11: Bye Bye [preauth]
Nov 24 13:26:32 compute-1 sshd-session[212180]: Disconnected from invalid user train1 68.183.82.237 port 54932 [preauth]
Nov 24 13:26:33 compute-1 podman[212184]: 2025-11-24 13:26:33.528762515 +0000 UTC m=+0.065593602 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:26:33 compute-1 podman[212183]: 2025-11-24 13:26:33.546058896 +0000 UTC m=+0.081262889 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:26:35 compute-1 podman[197429]: time="2025-11-24T13:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:26:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:26:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Nov 24 13:26:36 compute-1 nova_compute[187078]: 2025-11-24 13:26:36.819 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:37 compute-1 nova_compute[187078]: 2025-11-24 13:26:37.551 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:40 compute-1 podman[212226]: 2025-11-24 13:26:40.520881508 +0000 UTC m=+0.064293446 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:26:40 compute-1 podman[212227]: 2025-11-24 13:26:40.582708275 +0000 UTC m=+0.112020968 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 24 13:26:41 compute-1 nova_compute[187078]: 2025-11-24 13:26:41.822 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:42 compute-1 nova_compute[187078]: 2025-11-24 13:26:42.553 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:45 compute-1 sshd-session[212273]: Invalid user admin from 85.209.134.43 port 47946
Nov 24 13:26:45 compute-1 sshd-session[212273]: Received disconnect from 85.209.134.43 port 47946:11: Bye Bye [preauth]
Nov 24 13:26:45 compute-1 sshd-session[212273]: Disconnected from invalid user admin 85.209.134.43 port 47946 [preauth]
Nov 24 13:26:46 compute-1 nova_compute[187078]: 2025-11-24 13:26:46.824 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:47 compute-1 sshd-session[212275]: Invalid user cgpexpert from 176.114.89.34 port 38200
Nov 24 13:26:47 compute-1 nova_compute[187078]: 2025-11-24 13:26:47.555 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:47 compute-1 sshd-session[212275]: Received disconnect from 176.114.89.34 port 38200:11: Bye Bye [preauth]
Nov 24 13:26:47 compute-1 sshd-session[212275]: Disconnected from invalid user cgpexpert 176.114.89.34 port 38200 [preauth]
Nov 24 13:26:48 compute-1 podman[212279]: 2025-11-24 13:26:48.524664229 +0000 UTC m=+0.066221569 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: ERROR   13:26:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: ERROR   13:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: ERROR   13:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: ERROR   13:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: ERROR   13:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:26:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:26:50 compute-1 sshd-session[212277]: Invalid user vtatis from 45.78.194.40 port 38330
Nov 24 13:26:51 compute-1 sshd-session[212277]: Received disconnect from 45.78.194.40 port 38330:11: Bye Bye [preauth]
Nov 24 13:26:51 compute-1 sshd-session[212277]: Disconnected from invalid user vtatis 45.78.194.40 port 38330 [preauth]
Nov 24 13:26:51 compute-1 nova_compute[187078]: 2025-11-24 13:26:51.826 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:52 compute-1 nova_compute[187078]: 2025-11-24 13:26:52.591 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:56 compute-1 nova_compute[187078]: 2025-11-24 13:26:56.829 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:57 compute-1 nova_compute[187078]: 2025-11-24 13:26:57.593 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:26:59 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 13:27:01 compute-1 nova_compute[187078]: 2025-11-24 13:27:01.831 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:02 compute-1 nova_compute[187078]: 2025-11-24 13:27:02.594 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:04.154 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:04.156 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:04.156 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:04 compute-1 podman[212303]: 2025-11-24 13:27:04.549702843 +0000 UTC m=+0.077899436 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 24 13:27:04 compute-1 podman[212302]: 2025-11-24 13:27:04.560464607 +0000 UTC m=+0.090118419 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.385 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.386 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.406 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.504 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.505 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.510 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.510 187082 INFO nova.compute.claims [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.602 187082 DEBUG nova.compute.provider_tree [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.617 187082 DEBUG nova.scheduler.client.report [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.635 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.636 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:27:05 compute-1 podman[197429]: time="2025-11-24T13:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:27:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:27:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.673 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.673 187082 DEBUG nova.network.neutron [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.686 187082 INFO nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.699 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.776 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.778 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.778 187082 INFO nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Creating image(s)
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.779 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.780 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.781 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.807 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.846 187082 DEBUG nova.policy [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.903 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.904 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.904 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.915 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.987 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:05 compute-1 nova_compute[187078]: 2025-11-24 13:27:05.988 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.024 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.025 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.026 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.084 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.085 187082 DEBUG nova.virt.disk.api [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.085 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.149 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.150 187082 DEBUG nova.virt.disk.api [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.151 187082 DEBUG nova.objects.instance [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.168 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.168 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Ensure instance console log exists: /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.169 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.169 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.169 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.532 187082 DEBUG nova.network.neutron [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Successfully created port: e3a951ff-085f-4216-98f1-ff98fc0869ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:27:06 compute-1 nova_compute[187078]: 2025-11-24 13:27:06.834 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.270 187082 DEBUG nova.network.neutron [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Successfully updated port: e3a951ff-085f-4216-98f1-ff98fc0869ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.291 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.291 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.291 187082 DEBUG nova.network.neutron [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.374 187082 DEBUG nova.compute.manager [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-changed-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.377 187082 DEBUG nova.compute.manager [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Refreshing instance network info cache due to event network-changed-e3a951ff-085f-4216-98f1-ff98fc0869ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.377 187082 DEBUG oslo_concurrency.lockutils [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.463 187082 DEBUG nova.network.neutron [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:27:07 compute-1 nova_compute[187078]: 2025-11-24 13:27:07.595 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.210 187082 DEBUG nova.network.neutron [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updating instance_info_cache with network_info: [{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.234 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.234 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Instance network_info: |[{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.235 187082 DEBUG oslo_concurrency.lockutils [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.235 187082 DEBUG nova.network.neutron [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Refreshing network info cache for port e3a951ff-085f-4216-98f1-ff98fc0869ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.239 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Start _get_guest_xml network_info=[{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.246 187082 WARNING nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.252 187082 DEBUG nova.virt.libvirt.host [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.253 187082 DEBUG nova.virt.libvirt.host [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.262 187082 DEBUG nova.virt.libvirt.host [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.262 187082 DEBUG nova.virt.libvirt.host [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.264 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.264 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.265 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.265 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.266 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.266 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.266 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.267 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.267 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.267 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.267 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.268 187082 DEBUG nova.virt.hardware [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.273 187082 DEBUG nova.virt.libvirt.vif [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172964129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172964129',id=12,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-a9a033rx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_sta
te='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:27:05Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=42898dd6-ca48-4a89-9d80-50ac0c9f5b0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.273 187082 DEBUG nova.network.os_vif_util [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.274 187082 DEBUG nova.network.os_vif_util [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.276 187082 DEBUG nova.objects.instance [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.291 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <uuid>42898dd6-ca48-4a89-9d80-50ac0c9f5b0d</uuid>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <name>instance-0000000c</name>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-172964129</nova:name>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:27:08</nova:creationTime>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         <nova:port uuid="e3a951ff-085f-4216-98f1-ff98fc0869ad">
Nov 24 13:27:08 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <system>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <entry name="serial">42898dd6-ca48-4a89-9d80-50ac0c9f5b0d</entry>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <entry name="uuid">42898dd6-ca48-4a89-9d80-50ac0c9f5b0d</entry>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </system>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <os>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </os>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <features>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </features>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.config"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:43:7a:8d"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <target dev="tape3a951ff-08"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/console.log" append="off"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <video>
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </video>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:27:08 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:27:08 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:27:08 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:27:08 compute-1 nova_compute[187078]: </domain>
Nov 24 13:27:08 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.292 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Preparing to wait for external event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.293 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.293 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.293 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.294 187082 DEBUG nova.virt.libvirt.vif [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172964129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172964129',id=12,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-a9a033rx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:27:05Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=42898dd6-ca48-4a89-9d80-50ac0c9f5b0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.295 187082 DEBUG nova.network.os_vif_util [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.295 187082 DEBUG nova.network.os_vif_util [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.296 187082 DEBUG os_vif [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.297 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.298 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.298 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.303 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.304 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3a951ff-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.304 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape3a951ff-08, col_values=(('external_ids', {'iface-id': 'e3a951ff-085f-4216-98f1-ff98fc0869ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:7a:8d', 'vm-uuid': '42898dd6-ca48-4a89-9d80-50ac0c9f5b0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.306 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 NetworkManager[55527]: <info>  [1763990828.3081] manager: (tape3a951ff-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.310 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.317 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.318 187082 INFO os_vif [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08')
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.368 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.369 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.369 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:43:7a:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.370 187082 INFO nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Using config drive
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.634 187082 INFO nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Creating config drive at /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.config
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.640 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp34pxclc0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.777 187082 DEBUG oslo_concurrency.processutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp34pxclc0" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:08 compute-1 kernel: tape3a951ff-08: entered promiscuous mode
Nov 24 13:27:08 compute-1 NetworkManager[55527]: <info>  [1763990828.8352] manager: (tape3a951ff-08): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Nov 24 13:27:08 compute-1 systemd-udevd[212376]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.893 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 ovn_controller[95368]: 2025-11-24T13:27:08Z|00114|binding|INFO|Claiming lport e3a951ff-085f-4216-98f1-ff98fc0869ad for this chassis.
Nov 24 13:27:08 compute-1 ovn_controller[95368]: 2025-11-24T13:27:08Z|00115|binding|INFO|e3a951ff-085f-4216-98f1-ff98fc0869ad: Claiming fa:16:3e:43:7a:8d 10.100.0.6
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.896 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.907 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:7a:8d 10.100.0.6'], port_security=['fa:16:3e:43:7a:8d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '42898dd6-ca48-4a89-9d80-50ac0c9f5b0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=e3a951ff-085f-4216-98f1-ff98fc0869ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.909 104225 INFO neutron.agent.ovn.metadata.agent [-] Port e3a951ff-085f-4216-98f1-ff98fc0869ad in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.911 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:27:08 compute-1 NetworkManager[55527]: <info>  [1763990828.9126] device (tape3a951ff-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:27:08 compute-1 NetworkManager[55527]: <info>  [1763990828.9136] device (tape3a951ff-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.923 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[99c7627e-ba92-47f2-ae9c-4600bbed771a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.924 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:27:08 compute-1 systemd-machined[153355]: New machine qemu-9-instance-0000000c.
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.926 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.926 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[459f15c1-f161-4f19-836a-03c55e386e82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.927 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[cde1459d-8d62-4073-a6f0-b6cade7128a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.938 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbed3df-dfde-4687-9c79-dd13b73391b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 ovn_controller[95368]: 2025-11-24T13:27:08Z|00116|binding|INFO|Setting lport e3a951ff-085f-4216-98f1-ff98fc0869ad ovn-installed in OVS
Nov 24 13:27:08 compute-1 ovn_controller[95368]: 2025-11-24T13:27:08Z|00117|binding|INFO|Setting lport e3a951ff-085f-4216-98f1-ff98fc0869ad up in Southbound
Nov 24 13:27:08 compute-1 nova_compute[187078]: 2025-11-24 13:27:08.953 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:08 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.962 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5ed122-ab32-4543-b773-15060edc5022]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.990 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[e178e8a1-4774-4eba-adf8-c9b3a7a72776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:08.995 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e14e74f1-bba7-4032-a741-074f18a01c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:08 compute-1 NetworkManager[55527]: <info>  [1763990828.9964] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Nov 24 13:27:08 compute-1 systemd-udevd[212380]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.026 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[e9dd3f6f-f328-45d6-a8b4-16ca7ea03a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.029 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[02c160a0-16b0-4dd9-8b90-96300057a7f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 NetworkManager[55527]: <info>  [1763990829.0531] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.058 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb7b1d4-b2ed-4772-b86d-dbca35e756cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.078 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8021680c-7457-407d-886c-75b201dd1752]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384771, 'reachable_time': 18971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212412, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.094 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[98c78009-7379-40fd-ba0f-5a4eb1a31828]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384771, 'tstamp': 384771}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212413, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.113 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[53f61273-b34e-49e6-b190-7a81a7ccc7a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384771, 'reachable_time': 18971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212414, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.143 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff52ebb-d02b-4d3c-8f75-d8c21feab487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.199 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a0608451-4f0b-400a-9b56-35cff699ce60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.200 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.200 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.201 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.202 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:09 compute-1 NetworkManager[55527]: <info>  [1763990829.2036] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Nov 24 13:27:09 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.205 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.208 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.209 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:09 compute-1 ovn_controller[95368]: 2025-11-24T13:27:09Z|00118|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.210 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.219 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[fab41074-5151-4b5c-b327-ca471a93bf03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.221 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.221 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:27:09 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:09.222 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.421 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990829.4204388, 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.422 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] VM Started (Lifecycle Event)
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.441 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.446 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990829.421712, 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.446 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] VM Paused (Lifecycle Event)
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.464 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.469 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.473 187082 DEBUG nova.compute.manager [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.474 187082 DEBUG oslo_concurrency.lockutils [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.474 187082 DEBUG oslo_concurrency.lockutils [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.474 187082 DEBUG oslo_concurrency.lockutils [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.475 187082 DEBUG nova.compute.manager [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Processing event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.475 187082 DEBUG nova.compute.manager [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.475 187082 DEBUG oslo_concurrency.lockutils [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.476 187082 DEBUG oslo_concurrency.lockutils [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.476 187082 DEBUG oslo_concurrency.lockutils [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.476 187082 DEBUG nova.compute.manager [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.476 187082 WARNING nova.compute.manager [req-260457ad-527c-4f8c-bb79-a1a324e5d315 req-540556a3-5d7f-40b1-aee5-85f750b2bafc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received unexpected event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with vm_state building and task_state spawning.
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.477 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.483 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.488 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.489 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990829.4818285, 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.489 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] VM Resumed (Lifecycle Event)
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.493 187082 INFO nova.virt.libvirt.driver [-] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Instance spawned successfully.
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.493 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.505 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.511 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.515 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.516 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.516 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.517 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.517 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.518 187082 DEBUG nova.virt.libvirt.driver [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.538 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.569 187082 INFO nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Took 3.79 seconds to spawn the instance on the hypervisor.
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.570 187082 DEBUG nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.605 187082 DEBUG nova.network.neutron [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updated VIF entry in instance network info cache for port e3a951ff-085f-4216-98f1-ff98fc0869ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.606 187082 DEBUG nova.network.neutron [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updating instance_info_cache with network_info: [{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.618 187082 DEBUG oslo_concurrency.lockutils [req-ebe20477-baee-4803-b080-3c867048dee1 req-f550d4f6-b9a4-45e1-8658-86e503ffc073 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.677 187082 INFO nova.compute.manager [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Took 4.20 seconds to build instance.
Nov 24 13:27:09 compute-1 nova_compute[187078]: 2025-11-24 13:27:09.690 187082 DEBUG oslo_concurrency.lockutils [None req-00b263f6-b3b0-4de9-a50a-bb129e80c8dd 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:09 compute-1 podman[212452]: 2025-11-24 13:27:09.729308285 +0000 UTC m=+0.079755337 container create b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 13:27:09 compute-1 systemd[1]: Started libpod-conmon-b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d.scope.
Nov 24 13:27:09 compute-1 podman[212452]: 2025-11-24 13:27:09.691361489 +0000 UTC m=+0.041808531 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:27:09 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:27:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a95dd72ef7ddad9ff3ee680468e4403f3a0f573e817d7aacea379c20273f3c06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:27:09 compute-1 podman[212452]: 2025-11-24 13:27:09.82150271 +0000 UTC m=+0.171949772 container init b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:27:09 compute-1 podman[212452]: 2025-11-24 13:27:09.829393205 +0000 UTC m=+0.179840247 container start b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 24 13:27:09 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [NOTICE]   (212469) : New worker (212471) forked
Nov 24 13:27:09 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [NOTICE]   (212469) : Loading success.
Nov 24 13:27:10 compute-1 nova_compute[187078]: 2025-11-24 13:27:10.672 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:11 compute-1 podman[212481]: 2025-11-24 13:27:11.557897686 +0000 UTC m=+0.101781328 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:27:11 compute-1 podman[212480]: 2025-11-24 13:27:11.565494222 +0000 UTC m=+0.112863049 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.684 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.748 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.807 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.811 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:11 compute-1 nova_compute[187078]: 2025-11-24 13:27:11.902 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.104 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.106 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5720MB free_disk=73.46109390258789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.106 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.107 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.220 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.221 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.221 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.268 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.278 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.298 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.299 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:12 compute-1 nova_compute[187078]: 2025-11-24 13:27:12.650 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:13 compute-1 sshd-session[212533]: Invalid user sol from 45.148.10.240 port 46934
Nov 24 13:27:13 compute-1 sshd-session[212533]: Connection closed by invalid user sol 45.148.10.240 port 46934 [preauth]
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.294 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.294 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.295 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.295 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.307 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.480 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.481 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.481 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:27:13 compute-1 nova_compute[187078]: 2025-11-24 13:27:13.481 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:27:13 compute-1 sshd-session[212535]: Invalid user degen from 193.32.162.145 port 54890
Nov 24 13:27:13 compute-1 sshd-session[212535]: Connection closed by invalid user degen 193.32.162.145 port 54890 [preauth]
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.273 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updating instance_info_cache with network_info: [{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.295 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.295 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.296 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.296 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.296 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:27:15 compute-1 nova_compute[187078]: 2025-11-24 13:27:15.296 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:27:17 compute-1 nova_compute[187078]: 2025-11-24 13:27:17.693 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:18 compute-1 nova_compute[187078]: 2025-11-24 13:27:18.310 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: ERROR   13:27:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: ERROR   13:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: ERROR   13:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: ERROR   13:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: ERROR   13:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:27:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:27:19 compute-1 podman[212537]: 2025-11-24 13:27:19.53179231 +0000 UTC m=+0.071505271 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public)
Nov 24 13:27:21 compute-1 ovn_controller[95368]: 2025-11-24T13:27:21Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:43:7a:8d 10.100.0.6
Nov 24 13:27:21 compute-1 ovn_controller[95368]: 2025-11-24T13:27:21Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:7a:8d 10.100.0.6
Nov 24 13:27:22 compute-1 nova_compute[187078]: 2025-11-24 13:27:22.694 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:23 compute-1 nova_compute[187078]: 2025-11-24 13:27:23.312 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:27 compute-1 nova_compute[187078]: 2025-11-24 13:27:27.758 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:28 compute-1 nova_compute[187078]: 2025-11-24 13:27:28.314 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:32 compute-1 nova_compute[187078]: 2025-11-24 13:27:32.811 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:33 compute-1 nova_compute[187078]: 2025-11-24 13:27:33.316 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:35 compute-1 podman[212566]: 2025-11-24 13:27:35.517119152 +0000 UTC m=+0.057069848 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:27:35 compute-1 podman[212567]: 2025-11-24 13:27:35.517109761 +0000 UTC m=+0.050712484 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:27:35 compute-1 podman[197429]: time="2025-11-24T13:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:27:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:27:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Nov 24 13:27:37 compute-1 sshd-session[212606]: Received disconnect from 5.198.176.28 port 44338:11: Bye Bye [preauth]
Nov 24 13:27:37 compute-1 sshd-session[212606]: Disconnected from authenticating user root 5.198.176.28 port 44338 [preauth]
Nov 24 13:27:37 compute-1 nova_compute[187078]: 2025-11-24 13:27:37.867 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:38 compute-1 nova_compute[187078]: 2025-11-24 13:27:38.318 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:39 compute-1 ovn_controller[95368]: 2025-11-24T13:27:39Z|00119|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Nov 24 13:27:42 compute-1 podman[212608]: 2025-11-24 13:27:42.531771906 +0000 UTC m=+0.068520889 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 24 13:27:42 compute-1 podman[212609]: 2025-11-24 13:27:42.558660024 +0000 UTC m=+0.091593746 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:27:42 compute-1 sshd-session[212630]: Received disconnect from 85.209.134.43 port 59864:11: Bye Bye [preauth]
Nov 24 13:27:42 compute-1 sshd-session[212630]: Disconnected from authenticating user root 85.209.134.43 port 59864 [preauth]
Nov 24 13:27:42 compute-1 nova_compute[187078]: 2025-11-24 13:27:42.896 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:43 compute-1 nova_compute[187078]: 2025-11-24 13:27:43.320 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:47 compute-1 nova_compute[187078]: 2025-11-24 13:27:47.899 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:48 compute-1 sshd-session[212657]: Invalid user user1 from 68.183.82.237 port 46470
Nov 24 13:27:48 compute-1 sshd-session[212657]: Received disconnect from 68.183.82.237 port 46470:11: Bye Bye [preauth]
Nov 24 13:27:48 compute-1 sshd-session[212657]: Disconnected from invalid user user1 68.183.82.237 port 46470 [preauth]
Nov 24 13:27:48 compute-1 nova_compute[187078]: 2025-11-24 13:27:48.323 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: ERROR   13:27:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: ERROR   13:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: ERROR   13:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: ERROR   13:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: ERROR   13:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:27:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:27:50 compute-1 podman[212659]: 2025-11-24 13:27:50.558351335 +0000 UTC m=+0.093880348 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, distribution-scope=public)
Nov 24 13:27:51 compute-1 nova_compute[187078]: 2025-11-24 13:27:51.632 187082 DEBUG nova.compute.manager [None req-e69ec2b9-b5a4-4354-b55a-461b67cbbaf1 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 24 13:27:51 compute-1 nova_compute[187078]: 2025-11-24 13:27:51.678 187082 DEBUG nova.compute.provider_tree [None req-e69ec2b9-b5a4-4354-b55a-461b67cbbaf1 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 17 to 21 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:27:52 compute-1 nova_compute[187078]: 2025-11-24 13:27:52.935 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:53 compute-1 nova_compute[187078]: 2025-11-24 13:27:53.324 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:54 compute-1 sshd-session[212680]: Received disconnect from 176.114.89.34 port 45716:11: Bye Bye [preauth]
Nov 24 13:27:54 compute-1 sshd-session[212680]: Disconnected from authenticating user root 176.114.89.34 port 45716 [preauth]
Nov 24 13:27:55 compute-1 sshd-session[212682]: Invalid user user from 175.100.24.139 port 58768
Nov 24 13:27:55 compute-1 nova_compute[187078]: 2025-11-24 13:27:55.998 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Check if temp file /var/lib/nova/instances/tmpmabmhbp4 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:27:55 compute-1 nova_compute[187078]: 2025-11-24 13:27:55.998 187082 DEBUG nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmabmhbp4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='42898dd6-ca48-4a89-9d80-50ac0c9f5b0d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:27:56 compute-1 sshd-session[212682]: Received disconnect from 175.100.24.139 port 58768:11: Bye Bye [preauth]
Nov 24 13:27:56 compute-1 sshd-session[212682]: Disconnected from invalid user user 175.100.24.139 port 58768 [preauth]
Nov 24 13:27:56 compute-1 nova_compute[187078]: 2025-11-24 13:27:56.511 187082 DEBUG oslo_concurrency.processutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:56 compute-1 nova_compute[187078]: 2025-11-24 13:27:56.575 187082 DEBUG oslo_concurrency.processutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:56 compute-1 nova_compute[187078]: 2025-11-24 13:27:56.577 187082 DEBUG oslo_concurrency.processutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:27:56 compute-1 nova_compute[187078]: 2025-11-24 13:27:56.638 187082 DEBUG oslo_concurrency.processutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:27:57 compute-1 nova_compute[187078]: 2025-11-24 13:27:57.936 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:58 compute-1 nova_compute[187078]: 2025-11-24 13:27:58.326 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:58 compute-1 sshd-session[212691]: Accepted publickey for nova from 192.168.122.100 port 36712 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:27:58 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:27:58 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:27:58 compute-1 systemd-logind[815]: New session 37 of user nova.
Nov 24 13:27:58 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:27:58 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:27:58 compute-1 systemd[212695]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:27:58 compute-1 systemd[212695]: Queued start job for default target Main User Target.
Nov 24 13:27:58 compute-1 systemd[212695]: Created slice User Application Slice.
Nov 24 13:27:58 compute-1 systemd[212695]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:27:58 compute-1 systemd[212695]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:27:58 compute-1 systemd[212695]: Reached target Paths.
Nov 24 13:27:58 compute-1 systemd[212695]: Reached target Timers.
Nov 24 13:27:58 compute-1 systemd[212695]: Starting D-Bus User Message Bus Socket...
Nov 24 13:27:58 compute-1 systemd[212695]: Starting Create User's Volatile Files and Directories...
Nov 24 13:27:58 compute-1 systemd[212695]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:27:58 compute-1 systemd[212695]: Finished Create User's Volatile Files and Directories.
Nov 24 13:27:58 compute-1 systemd[212695]: Reached target Sockets.
Nov 24 13:27:58 compute-1 systemd[212695]: Reached target Basic System.
Nov 24 13:27:58 compute-1 systemd[212695]: Reached target Main User Target.
Nov 24 13:27:58 compute-1 systemd[212695]: Startup finished in 142ms.
Nov 24 13:27:58 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:27:58 compute-1 systemd[1]: Started Session 37 of User nova.
Nov 24 13:27:58 compute-1 sshd-session[212691]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:27:58 compute-1 sshd-session[212710]: Received disconnect from 192.168.122.100 port 36712:11: disconnected by user
Nov 24 13:27:58 compute-1 sshd-session[212710]: Disconnected from user nova 192.168.122.100 port 36712
Nov 24 13:27:58 compute-1 sshd-session[212691]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:27:58 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Nov 24 13:27:58 compute-1 systemd-logind[815]: Session 37 logged out. Waiting for processes to exit.
Nov 24 13:27:58 compute-1 systemd-logind[815]: Removed session 37.
Nov 24 13:27:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:59.928 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:27:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:27:59.930 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.974 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.979 187082 DEBUG nova.compute.manager [req-b259afe0-cff1-4fa0-bf6f-01dcc3827a16 req-bf955891-f60a-4c07-bb94-b89e94597725 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-unplugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.979 187082 DEBUG oslo_concurrency.lockutils [req-b259afe0-cff1-4fa0-bf6f-01dcc3827a16 req-bf955891-f60a-4c07-bb94-b89e94597725 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.979 187082 DEBUG oslo_concurrency.lockutils [req-b259afe0-cff1-4fa0-bf6f-01dcc3827a16 req-bf955891-f60a-4c07-bb94-b89e94597725 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.980 187082 DEBUG oslo_concurrency.lockutils [req-b259afe0-cff1-4fa0-bf6f-01dcc3827a16 req-bf955891-f60a-4c07-bb94-b89e94597725 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.980 187082 DEBUG nova.compute.manager [req-b259afe0-cff1-4fa0-bf6f-01dcc3827a16 req-bf955891-f60a-4c07-bb94-b89e94597725 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-unplugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:27:59 compute-1 nova_compute[187078]: 2025-11-24 13:27:59.980 187082 DEBUG nova.compute.manager [req-b259afe0-cff1-4fa0-bf6f-01dcc3827a16 req-bf955891-f60a-4c07-bb94-b89e94597725 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-unplugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.239 187082 INFO nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Took 3.60 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.239 187082 DEBUG nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.256 187082 DEBUG nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmabmhbp4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='42898dd6-ca48-4a89-9d80-50ac0c9f5b0d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(288b0361-2859-42c6-a030-501985e8a1c0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.276 187082 DEBUG nova.objects.instance [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.278 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.279 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.279 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.293 187082 DEBUG nova.virt.libvirt.vif [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172964129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172964129',id=12,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:27:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-a9a033rx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:27:09Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=42898dd6-ca48-4a89-9d80-50ac0c9f5b0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.294 187082 DEBUG nova.network.os_vif_util [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.294 187082 DEBUG nova.network.os_vif_util [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.295 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:28:00 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:43:7a:8d"/>
Nov 24 13:28:00 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:28:00 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:28:00 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:28:00 compute-1 nova_compute[187078]:   <target dev="tape3a951ff-08"/>
Nov 24 13:28:00 compute-1 nova_compute[187078]: </interface>
Nov 24 13:28:00 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.295 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.782 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.783 187082 INFO nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:28:00 compute-1 nova_compute[187078]: 2025-11-24 13:28:00.864 187082 INFO nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:28:01 compute-1 nova_compute[187078]: 2025-11-24 13:28:01.368 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:28:01 compute-1 nova_compute[187078]: 2025-11-24 13:28:01.368 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:28:01 compute-1 nova_compute[187078]: 2025-11-24 13:28:01.871 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:28:01 compute-1 nova_compute[187078]: 2025-11-24 13:28:01.872 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:28:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:01.933 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.093 187082 DEBUG nova.compute.manager [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.094 187082 DEBUG oslo_concurrency.lockutils [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.094 187082 DEBUG oslo_concurrency.lockutils [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.094 187082 DEBUG oslo_concurrency.lockutils [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.094 187082 DEBUG nova.compute.manager [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.095 187082 WARNING nova.compute.manager [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received unexpected event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with vm_state active and task_state migrating.
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.095 187082 DEBUG nova.compute.manager [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-changed-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.095 187082 DEBUG nova.compute.manager [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Refreshing instance network info cache due to event network-changed-e3a951ff-085f-4216-98f1-ff98fc0869ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.096 187082 DEBUG oslo_concurrency.lockutils [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.096 187082 DEBUG oslo_concurrency.lockutils [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.096 187082 DEBUG nova.network.neutron [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Refreshing network info cache for port e3a951ff-085f-4216-98f1-ff98fc0869ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.374 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.375 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.815 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990882.814645, 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.815 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] VM Paused (Lifecycle Event)
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.832 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.836 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.857 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.878 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.878 187082 DEBUG nova.virt.libvirt.migration [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.939 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:02 compute-1 kernel: tape3a951ff-08 (unregistering): left promiscuous mode
Nov 24 13:28:02 compute-1 NetworkManager[55527]: <info>  [1763990882.9850] device (tape3a951ff-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.990 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:02 compute-1 ovn_controller[95368]: 2025-11-24T13:28:02Z|00120|binding|INFO|Releasing lport e3a951ff-085f-4216-98f1-ff98fc0869ad from this chassis (sb_readonly=0)
Nov 24 13:28:02 compute-1 ovn_controller[95368]: 2025-11-24T13:28:02Z|00121|binding|INFO|Setting lport e3a951ff-085f-4216-98f1-ff98fc0869ad down in Southbound
Nov 24 13:28:02 compute-1 ovn_controller[95368]: 2025-11-24T13:28:02Z|00122|binding|INFO|Removing iface tape3a951ff-08 ovn-installed in OVS
Nov 24 13:28:02 compute-1 nova_compute[187078]: 2025-11-24 13:28:02.993 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:02.999 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:7a:8d 10.100.0.6'], port_security=['fa:16:3e:43:7a:8d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '42898dd6-ca48-4a89-9d80-50ac0c9f5b0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=e3a951ff-085f-4216-98f1-ff98fc0869ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.006 104225 INFO neutron.agent.ovn.metadata.agent [-] Port e3a951ff-085f-4216-98f1-ff98fc0869ad in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.008 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.009 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.009 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1977e0d1-0749-48cf-997b-d40ba1fd03d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.011 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:28:03 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 24 13:28:03 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 15.951s CPU time.
Nov 24 13:28:03 compute-1 systemd-machined[153355]: Machine qemu-9-instance-0000000c terminated.
Nov 24 13:28:03 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [NOTICE]   (212469) : haproxy version is 2.8.14-c23fe91
Nov 24 13:28:03 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [NOTICE]   (212469) : path to executable is /usr/sbin/haproxy
Nov 24 13:28:03 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [WARNING]  (212469) : Exiting Master process...
Nov 24 13:28:03 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [WARNING]  (212469) : Exiting Master process...
Nov 24 13:28:03 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [ALERT]    (212469) : Current worker (212471) exited with code 143 (Terminated)
Nov 24 13:28:03 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[212465]: [WARNING]  (212469) : All workers exited. Exiting... (0)
Nov 24 13:28:03 compute-1 systemd[1]: libpod-b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d.scope: Deactivated successfully.
Nov 24 13:28:03 compute-1 conmon[212465]: conmon b5ddca5380c9b1da3816 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d.scope/container/memory.events
Nov 24 13:28:03 compute-1 podman[212754]: 2025-11-24 13:28:03.1496518 +0000 UTC m=+0.043360978 container died b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:28:03 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d-userdata-shm.mount: Deactivated successfully.
Nov 24 13:28:03 compute-1 systemd[1]: var-lib-containers-storage-overlay-a95dd72ef7ddad9ff3ee680468e4403f3a0f573e817d7aacea379c20273f3c06-merged.mount: Deactivated successfully.
Nov 24 13:28:03 compute-1 podman[212754]: 2025-11-24 13:28:03.194323232 +0000 UTC m=+0.088032410 container cleanup b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 13:28:03 compute-1 systemd[1]: libpod-conmon-b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d.scope: Deactivated successfully.
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.217 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.217 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.217 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:28:03 compute-1 podman[212797]: 2025-11-24 13:28:03.28155359 +0000 UTC m=+0.061808170 container remove b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.287 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3adebdd4-768f-4cb7-96fe-48c71408106c]: (4, ('Mon Nov 24 01:28:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d)\nb5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d\nMon Nov 24 01:28:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (b5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d)\nb5ddca5380c9b1da3816bebd51a0ab36a9d22ed76102d94264820b52707ab86d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.289 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[6be9ba0a-379e-49f9-867c-708228b7d1a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.291 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.340 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:03 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.358 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.360 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[16237377-e2ca-4eff-afe7-282da428c752]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.380 187082 DEBUG nova.virt.libvirt.guest [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '42898dd6-ca48-4a89-9d80-50ac0c9f5b0d' (instance-0000000c) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.380 187082 INFO nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migration operation has completed
Nov 24 13:28:03 compute-1 nova_compute[187078]: 2025-11-24 13:28:03.381 187082 INFO nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] _post_live_migration() is started..
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.380 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9a2325-c2b0-4024-843e-cf6cb03f0a7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.382 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2d05fb-5c7d-4691-8bd3-92d66cf5da19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.398 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[88a567d1-4d3e-4ce0-9e04-e67de82d4dce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384764, 'reachable_time': 18966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212817, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:03 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.404 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:28:03 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:03.404 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[17be775e-ffcb-420e-88ea-9aab12ff8fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:04.155 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:04.156 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:04.156 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:04 compute-1 nova_compute[187078]: 2025-11-24 13:28:04.368 187082 DEBUG nova.compute.manager [req-4b23558c-9103-4280-80e5-e9ad554a895f req-8ac950a1-4b2e-475b-bb3b-781f77fe1490 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-unplugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:04 compute-1 nova_compute[187078]: 2025-11-24 13:28:04.368 187082 DEBUG oslo_concurrency.lockutils [req-4b23558c-9103-4280-80e5-e9ad554a895f req-8ac950a1-4b2e-475b-bb3b-781f77fe1490 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:04 compute-1 nova_compute[187078]: 2025-11-24 13:28:04.369 187082 DEBUG oslo_concurrency.lockutils [req-4b23558c-9103-4280-80e5-e9ad554a895f req-8ac950a1-4b2e-475b-bb3b-781f77fe1490 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:04 compute-1 nova_compute[187078]: 2025-11-24 13:28:04.369 187082 DEBUG oslo_concurrency.lockutils [req-4b23558c-9103-4280-80e5-e9ad554a895f req-8ac950a1-4b2e-475b-bb3b-781f77fe1490 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:04 compute-1 nova_compute[187078]: 2025-11-24 13:28:04.369 187082 DEBUG nova.compute.manager [req-4b23558c-9103-4280-80e5-e9ad554a895f req-8ac950a1-4b2e-475b-bb3b-781f77fe1490 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-unplugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:28:04 compute-1 nova_compute[187078]: 2025-11-24 13:28:04.369 187082 DEBUG nova.compute.manager [req-4b23558c-9103-4280-80e5-e9ad554a895f req-8ac950a1-4b2e-475b-bb3b-781f77fe1490 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-unplugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:28:05 compute-1 podman[197429]: time="2025-11-24T13:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:28:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:28:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Nov 24 13:28:05 compute-1 nova_compute[187078]: 2025-11-24 13:28:05.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.347 187082 DEBUG nova.network.neutron [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port e3a951ff-085f-4216-98f1-ff98fc0869ad and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.348 187082 DEBUG nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.348 187082 DEBUG nova.virt.libvirt.vif [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:27:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172964129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172964129',id=12,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:27:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-a9a033rx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:27:54Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=42898dd6-ca48-4a89-9d80-50ac0c9f5b0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.348 187082 DEBUG nova.network.os_vif_util [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.349 187082 DEBUG nova.network.os_vif_util [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.349 187082 DEBUG os_vif [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.352 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.352 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3a951ff-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.353 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.355 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.358 187082 INFO os_vif [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:7a:8d,bridge_name='br-int',has_traffic_filtering=True,id=e3a951ff-085f-4216-98f1-ff98fc0869ad,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3a951ff-08')
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.359 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.359 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.360 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.360 187082 DEBUG nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.361 187082 INFO nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Deleting instance files /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d_del
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.362 187082 INFO nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Deletion of /var/lib/nova/instances/42898dd6-ca48-4a89-9d80-50ac0c9f5b0d_del complete
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.492 187082 DEBUG nova.compute.manager [req-4ccaac12-3d7a-44c2-9989-3174deaf7a6f req-a1a93a0e-911e-4b3d-91e0-1842b019f92a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.492 187082 DEBUG oslo_concurrency.lockutils [req-4ccaac12-3d7a-44c2-9989-3174deaf7a6f req-a1a93a0e-911e-4b3d-91e0-1842b019f92a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.492 187082 DEBUG oslo_concurrency.lockutils [req-4ccaac12-3d7a-44c2-9989-3174deaf7a6f req-a1a93a0e-911e-4b3d-91e0-1842b019f92a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.493 187082 DEBUG oslo_concurrency.lockutils [req-4ccaac12-3d7a-44c2-9989-3174deaf7a6f req-a1a93a0e-911e-4b3d-91e0-1842b019f92a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.493 187082 DEBUG nova.compute.manager [req-4ccaac12-3d7a-44c2-9989-3174deaf7a6f req-a1a93a0e-911e-4b3d-91e0-1842b019f92a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.493 187082 WARNING nova.compute.manager [req-4ccaac12-3d7a-44c2-9989-3174deaf7a6f req-a1a93a0e-911e-4b3d-91e0-1842b019f92a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received unexpected event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with vm_state active and task_state migrating.
Nov 24 13:28:06 compute-1 podman[212818]: 2025-11-24 13:28:06.548645203 +0000 UTC m=+0.072744703 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:28:06 compute-1 podman[212819]: 2025-11-24 13:28:06.557656753 +0000 UTC m=+0.073849243 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.766 187082 DEBUG nova.network.neutron [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updated VIF entry in instance network info cache for port e3a951ff-085f-4216-98f1-ff98fc0869ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.766 187082 DEBUG nova.network.neutron [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Updating instance_info_cache with network_info: [{"id": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "address": "fa:16:3e:43:7a:8d", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3a951ff-08", "ovs_interfaceid": "e3a951ff-085f-4216-98f1-ff98fc0869ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:28:06 compute-1 nova_compute[187078]: 2025-11-24 13:28:06.783 187082 DEBUG oslo_concurrency.lockutils [req-86a48f76-d9e5-47ba-93ac-b7a46d733cd2 req-0fca102b-8261-4cfe-891b-4b71930751bf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-42898dd6-ca48-4a89-9d80-50ac0c9f5b0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:28:07 compute-1 nova_compute[187078]: 2025-11-24 13:28:07.942 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.564 187082 DEBUG nova.compute.manager [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.565 187082 DEBUG oslo_concurrency.lockutils [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.565 187082 DEBUG oslo_concurrency.lockutils [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.566 187082 DEBUG oslo_concurrency.lockutils [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.566 187082 DEBUG nova.compute.manager [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.567 187082 WARNING nova.compute.manager [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received unexpected event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with vm_state active and task_state migrating.
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.568 187082 DEBUG nova.compute.manager [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.568 187082 DEBUG oslo_concurrency.lockutils [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.569 187082 DEBUG oslo_concurrency.lockutils [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.569 187082 DEBUG oslo_concurrency.lockutils [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.570 187082 DEBUG nova.compute.manager [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] No waiting events found dispatching network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:28:08 compute-1 nova_compute[187078]: 2025-11-24 13:28:08.570 187082 WARNING nova.compute.manager [req-1c457a2e-2c02-461b-9f95-7273b3124e42 req-eb2d57cd-c399-47dd-9748-60735b6ddf16 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Received unexpected event network-vif-plugged-e3a951ff-085f-4216-98f1-ff98fc0869ad for instance with vm_state active and task_state migrating.
Nov 24 13:28:08 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:28:08 compute-1 systemd[212695]: Activating special unit Exit the Session...
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped target Main User Target.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped target Basic System.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped target Paths.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped target Sockets.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped target Timers.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:28:08 compute-1 systemd[212695]: Closed D-Bus User Message Bus Socket.
Nov 24 13:28:08 compute-1 systemd[212695]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:28:08 compute-1 systemd[212695]: Removed slice User Application Slice.
Nov 24 13:28:08 compute-1 systemd[212695]: Reached target Shutdown.
Nov 24 13:28:08 compute-1 systemd[212695]: Finished Exit the Session.
Nov 24 13:28:08 compute-1 systemd[212695]: Reached target Exit the Session.
Nov 24 13:28:08 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:28:08 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:28:08 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:28:08 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:28:08 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:28:08 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:28:08 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:28:10 compute-1 nova_compute[187078]: 2025-11-24 13:28:10.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:10 compute-1 nova_compute[187078]: 2025-11-24 13:28:10.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.355 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.707 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.708 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.708 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "42898dd6-ca48-4a89-9d80-50ac0c9f5b0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.726 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.726 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.727 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.727 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.896 187082 WARNING nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.897 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5889MB free_disk=73.4599723815918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.898 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.898 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.947 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:28:11 compute-1 nova_compute[187078]: 2025-11-24 13:28:11.967 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.006 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration 288b0361-2859-42c6-a030-501985e8a1c0 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.007 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.007 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.057 187082 DEBUG nova.compute.provider_tree [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.071 187082 DEBUG nova.scheduler.client.report [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.087 187082 DEBUG nova.compute.resource_tracker [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.088 187082 DEBUG oslo_concurrency.lockutils [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.092 187082 INFO nova.compute.manager [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.157 187082 INFO nova.scheduler.client.report [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration 288b0361-2859-42c6-a030-501985e8a1c0
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.158 187082 DEBUG nova.virt.libvirt.driver [None req-a739cedc-18c3-4a56-b27e-9191f41c6607 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.689 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.689 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.882 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.883 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5889MB free_disk=73.4599723815918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.883 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.883 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.933 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.934 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.943 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.966 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.978 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.980 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:28:12 compute-1 nova_compute[187078]: 2025-11-24 13:28:12.980 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:13 compute-1 podman[212862]: 2025-11-24 13:28:13.519694076 +0000 UTC m=+0.063405073 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd)
Nov 24 13:28:13 compute-1 podman[212863]: 2025-11-24 13:28:13.57078624 +0000 UTC m=+0.113660144 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 24 13:28:13 compute-1 nova_compute[187078]: 2025-11-24 13:28:13.980 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:13 compute-1 nova_compute[187078]: 2025-11-24 13:28:13.980 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:28:13 compute-1 nova_compute[187078]: 2025-11-24 13:28:13.980 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:28:13 compute-1 nova_compute[187078]: 2025-11-24 13:28:13.991 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:28:13 compute-1 nova_compute[187078]: 2025-11-24 13:28:13.992 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:14 compute-1 nova_compute[187078]: 2025-11-24 13:28:14.671 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:15 compute-1 nova_compute[187078]: 2025-11-24 13:28:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:15 compute-1 nova_compute[187078]: 2025-11-24 13:28:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:15 compute-1 nova_compute[187078]: 2025-11-24 13:28:15.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:28:16 compute-1 nova_compute[187078]: 2025-11-24 13:28:16.358 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:17 compute-1 nova_compute[187078]: 2025-11-24 13:28:17.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:28:17 compute-1 nova_compute[187078]: 2025-11-24 13:28:17.945 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:18 compute-1 nova_compute[187078]: 2025-11-24 13:28:18.213 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763990883.2128975, 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:28:18 compute-1 nova_compute[187078]: 2025-11-24 13:28:18.214 187082 INFO nova.compute.manager [-] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] VM Stopped (Lifecycle Event)
Nov 24 13:28:18 compute-1 nova_compute[187078]: 2025-11-24 13:28:18.236 187082 DEBUG nova.compute.manager [None req-1d9b179d-5fca-4138-84b5-9c4be9856cfd - - - - - -] [instance: 42898dd6-ca48-4a89-9d80-50ac0c9f5b0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:28:19 compute-1 openstack_network_exporter[199599]: ERROR   13:28:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:28:19 compute-1 openstack_network_exporter[199599]: ERROR   13:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:28:19 compute-1 openstack_network_exporter[199599]: ERROR   13:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:28:19 compute-1 openstack_network_exporter[199599]: ERROR   13:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:28:19 compute-1 openstack_network_exporter[199599]: ERROR   13:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:28:21 compute-1 nova_compute[187078]: 2025-11-24 13:28:21.360 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:21 compute-1 podman[212907]: 2025-11-24 13:28:21.51993175 +0000 UTC m=+0.064518154 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, 
com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Nov 24 13:28:22 compute-1 nova_compute[187078]: 2025-11-24 13:28:22.947 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:26 compute-1 nova_compute[187078]: 2025-11-24 13:28:26.363 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:27 compute-1 nova_compute[187078]: 2025-11-24 13:28:27.950 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:31 compute-1 nova_compute[187078]: 2025-11-24 13:28:31.445 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:32 compute-1 nova_compute[187078]: 2025-11-24 13:28:32.985 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:34 compute-1 nova_compute[187078]: 2025-11-24 13:28:34.944 187082 DEBUG nova.compute.manager [None req-5f0ba0f4-d454-44f8-a470-fdd765f922bc f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 24 13:28:34 compute-1 nova_compute[187078]: 2025-11-24 13:28:34.986 187082 DEBUG nova.compute.provider_tree [None req-5f0ba0f4-d454-44f8-a470-fdd765f922bc f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 21 to 24 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:28:35 compute-1 podman[197429]: time="2025-11-24T13:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:28:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:28:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Nov 24 13:28:36 compute-1 nova_compute[187078]: 2025-11-24 13:28:36.448 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:36 compute-1 sshd-session[212929]: Invalid user ftpuser1 from 45.78.217.131 port 33030
Nov 24 13:28:36 compute-1 podman[212932]: 2025-11-24 13:28:36.865206133 +0000 UTC m=+0.048719382 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 24 13:28:36 compute-1 podman[212931]: 2025-11-24 13:28:36.867995098 +0000 UTC m=+0.055487642 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:28:37 compute-1 sshd-session[212929]: Received disconnect from 45.78.217.131 port 33030:11: Bye Bye [preauth]
Nov 24 13:28:37 compute-1 sshd-session[212929]: Disconnected from invalid user ftpuser1 45.78.217.131 port 33030 [preauth]
Nov 24 13:28:37 compute-1 nova_compute[187078]: 2025-11-24 13:28:37.986 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:39 compute-1 sshd-session[212972]: Invalid user zmarin from 85.209.134.43 port 49912
Nov 24 13:28:39 compute-1 sshd-session[212972]: Received disconnect from 85.209.134.43 port 49912:11: Bye Bye [preauth]
Nov 24 13:28:39 compute-1 sshd-session[212972]: Disconnected from invalid user zmarin 85.209.134.43 port 49912 [preauth]
Nov 24 13:28:41 compute-1 nova_compute[187078]: 2025-11-24 13:28:41.452 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:42 compute-1 nova_compute[187078]: 2025-11-24 13:28:42.987 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:44 compute-1 podman[212976]: 2025-11-24 13:28:44.510252759 +0000 UTC m=+0.057756613 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 13:28:44 compute-1 podman[212977]: 2025-11-24 13:28:44.572193362 +0000 UTC m=+0.119021498 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 24 13:28:44 compute-1 sshd-session[212974]: Invalid user temp from 5.198.176.28 port 44446
Nov 24 13:28:44 compute-1 sshd-session[212974]: Received disconnect from 5.198.176.28 port 44446:11: Bye Bye [preauth]
Nov 24 13:28:44 compute-1 sshd-session[212974]: Disconnected from invalid user temp 5.198.176.28 port 44446 [preauth]
Nov 24 13:28:46 compute-1 nova_compute[187078]: 2025-11-24 13:28:46.454 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:47 compute-1 ovn_controller[95368]: 2025-11-24T13:28:47Z|00123|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 24 13:28:48 compute-1 nova_compute[187078]: 2025-11-24 13:28:48.013 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:49 compute-1 openstack_network_exporter[199599]: ERROR   13:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:28:49 compute-1 openstack_network_exporter[199599]: ERROR   13:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:28:49 compute-1 openstack_network_exporter[199599]: ERROR   13:28:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:28:49 compute-1 openstack_network_exporter[199599]: ERROR   13:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:28:49 compute-1 openstack_network_exporter[199599]: ERROR   13:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:28:50 compute-1 nova_compute[187078]: 2025-11-24 13:28:50.840 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:50 compute-1 nova_compute[187078]: 2025-11-24 13:28:50.840 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:50 compute-1 nova_compute[187078]: 2025-11-24 13:28:50.889 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.016 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.017 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.026 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.026 187082 INFO nova.compute.claims [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.310 187082 DEBUG nova.compute.provider_tree [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.340 187082 DEBUG nova.scheduler.client.report [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.434 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.435 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.490 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.491 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.492 187082 DEBUG nova.network.neutron [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.514 187082 INFO nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.589 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.773 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.774 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.774 187082 INFO nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Creating image(s)
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.775 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.776 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.776 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.788 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.861 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.863 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.863 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.876 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.935 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:28:51 compute-1 nova_compute[187078]: 2025-11-24 13:28:51.937 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.166 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk 1073741824" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.169 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.170 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.249 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.252 187082 DEBUG nova.virt.disk.api [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.253 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.289 187082 DEBUG nova.policy [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.335 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.336 187082 DEBUG nova.virt.disk.api [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.336 187082 DEBUG nova.objects.instance [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid 2975848d-b193-4147-9775-3861b433ffd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.365 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.366 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Ensure instance console log exists: /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.366 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.367 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:52 compute-1 nova_compute[187078]: 2025-11-24 13:28:52.367 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:52 compute-1 podman[213037]: 2025-11-24 13:28:52.576950448 +0000 UTC m=+0.112966646 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:28:53 compute-1 nova_compute[187078]: 2025-11-24 13:28:53.014 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:53 compute-1 nova_compute[187078]: 2025-11-24 13:28:53.599 187082 DEBUG nova.network.neutron [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Successfully created port: b67799d4-0548-47ba-a2a4-2f8dd8402dd6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.088 187082 DEBUG nova.network.neutron [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Successfully updated port: b67799d4-0548-47ba-a2a4-2f8dd8402dd6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.108 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.109 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.109 187082 DEBUG nova.network.neutron [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.165 187082 DEBUG nova.compute.manager [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-changed-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.166 187082 DEBUG nova.compute.manager [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Refreshing instance network info cache due to event network-changed-b67799d4-0548-47ba-a2a4-2f8dd8402dd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.166 187082 DEBUG oslo_concurrency.lockutils [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:28:55 compute-1 nova_compute[187078]: 2025-11-24 13:28:55.242 187082 DEBUG nova.network.neutron [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.255 187082 DEBUG nova.network.neutron [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updating instance_info_cache with network_info: [{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.276 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.277 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Instance network_info: |[{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.278 187082 DEBUG oslo_concurrency.lockutils [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.278 187082 DEBUG nova.network.neutron [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Refreshing network info cache for port b67799d4-0548-47ba-a2a4-2f8dd8402dd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.281 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Start _get_guest_xml network_info=[{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.285 187082 WARNING nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.291 187082 DEBUG nova.virt.libvirt.host [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.292 187082 DEBUG nova.virt.libvirt.host [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.297 187082 DEBUG nova.virt.libvirt.host [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.298 187082 DEBUG nova.virt.libvirt.host [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.299 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.299 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.300 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.300 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.300 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.301 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.301 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.301 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.301 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.302 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.302 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.302 187082 DEBUG nova.virt.hardware [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.306 187082 DEBUG nova.virt.libvirt.vif [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-376350173',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-376350173',id=13,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-4dofsm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:28:51Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=2975848d-b193-4147-9775-3861b433ffd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.306 187082 DEBUG nova.network.os_vif_util [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.307 187082 DEBUG nova.network.os_vif_util [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.308 187082 DEBUG nova.objects.instance [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2975848d-b193-4147-9775-3861b433ffd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.325 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <uuid>2975848d-b193-4147-9775-3861b433ffd1</uuid>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <name>instance-0000000d</name>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-376350173</nova:name>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:28:56</nova:creationTime>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         <nova:port uuid="b67799d4-0548-47ba-a2a4-2f8dd8402dd6">
Nov 24 13:28:56 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <system>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <entry name="serial">2975848d-b193-4147-9775-3861b433ffd1</entry>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <entry name="uuid">2975848d-b193-4147-9775-3861b433ffd1</entry>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </system>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <os>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </os>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <features>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </features>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.config"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:37:50:20"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <target dev="tapb67799d4-05"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/console.log" append="off"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <video>
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </video>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:28:56 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:28:56 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:28:56 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:28:56 compute-1 nova_compute[187078]: </domain>
Nov 24 13:28:56 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.327 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Preparing to wait for external event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.327 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.327 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.327 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.328 187082 DEBUG nova.virt.libvirt.vif [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-376350173',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-376350173',id=13,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-4dofsm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:28:51Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=2975848d-b193-4147-9775-3861b433ffd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.328 187082 DEBUG nova.network.os_vif_util [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.329 187082 DEBUG nova.network.os_vif_util [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.329 187082 DEBUG os_vif [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.329 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.330 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.330 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.333 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.333 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb67799d4-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.333 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb67799d4-05, col_values=(('external_ids', {'iface-id': 'b67799d4-0548-47ba-a2a4-2f8dd8402dd6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:50:20', 'vm-uuid': '2975848d-b193-4147-9775-3861b433ffd1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.428 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:56 compute-1 NetworkManager[55527]: <info>  [1763990936.4296] manager: (tapb67799d4-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.432 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.435 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.436 187082 INFO os_vif [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05')
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.510 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.511 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.511 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:37:50:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:28:56 compute-1 nova_compute[187078]: 2025-11-24 13:28:56.511 187082 INFO nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Using config drive
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.268 187082 INFO nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Creating config drive at /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.config
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.273 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvqxziej6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.403 187082 DEBUG oslo_concurrency.processutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvqxziej6" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:28:57 compute-1 kernel: tapb67799d4-05: entered promiscuous mode
Nov 24 13:28:57 compute-1 NetworkManager[55527]: <info>  [1763990937.5022] manager: (tapb67799d4-05): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 24 13:28:57 compute-1 systemd-udevd[213076]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.547 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:57 compute-1 ovn_controller[95368]: 2025-11-24T13:28:57Z|00124|binding|INFO|Claiming lport b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for this chassis.
Nov 24 13:28:57 compute-1 ovn_controller[95368]: 2025-11-24T13:28:57Z|00125|binding|INFO|b67799d4-0548-47ba-a2a4-2f8dd8402dd6: Claiming fa:16:3e:37:50:20 10.100.0.12
Nov 24 13:28:57 compute-1 ovn_controller[95368]: 2025-11-24T13:28:57Z|00126|binding|INFO|Setting lport b67799d4-0548-47ba-a2a4-2f8dd8402dd6 ovn-installed in OVS
Nov 24 13:28:57 compute-1 ovn_controller[95368]: 2025-11-24T13:28:57Z|00127|binding|INFO|Setting lport b67799d4-0548-47ba-a2a4-2f8dd8402dd6 up in Southbound
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.561 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.556 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:50:20 10.100.0.12'], port_security=['fa:16:3e:37:50:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2975848d-b193-4147-9775-3861b433ffd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=b67799d4-0548-47ba-a2a4-2f8dd8402dd6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.558 104225 INFO neutron.agent.ovn.metadata.agent [-] Port b67799d4-0548-47ba-a2a4-2f8dd8402dd6 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.559 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:28:57 compute-1 NetworkManager[55527]: <info>  [1763990937.5667] device (tapb67799d4-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:28:57 compute-1 NetworkManager[55527]: <info>  [1763990937.5681] device (tapb67799d4-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.571 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2a051d-ed15-4152-b99e-0eab2ae3e2b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.572 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.575 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.575 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[db7f3465-5084-4834-9ae3-2b42360ffa3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.576 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5284b742-e955-4a55-80f9-71cec28fa7ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 systemd-machined[153355]: New machine qemu-10-instance-0000000d.
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.590 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[869651cc-08dd-4475-8fe5-f38fc24908e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.604 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[11e47293-28b9-4850-97d8-114d9c854194]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.644 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[14000e16-972b-48f5-b60b-0c7e95bb0a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 NetworkManager[55527]: <info>  [1763990937.6530] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.652 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b5fe25a8-7c81-413d-b906-b7179f5f81be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.684 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c808266b-516a-406d-8a4f-8b66a5987a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.687 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[345e1dba-2042-4f22-b35d-8c152c1e83ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 NetworkManager[55527]: <info>  [1763990937.7103] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.717 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c97b38e5-dd0f-44b4-9821-f22908049cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.734 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3f1690cd-b3ee-48b6-80d2-80b514e74d95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395637, 'reachable_time': 34678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213112, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.739 187082 DEBUG nova.network.neutron [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updated VIF entry in instance network info cache for port b67799d4-0548-47ba-a2a4-2f8dd8402dd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.739 187082 DEBUG nova.network.neutron [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updating instance_info_cache with network_info: [{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.743 187082 DEBUG nova.compute.manager [req-da3101c5-50ee-4c91-8c9a-38ce08bb841c req-e3de91f3-5096-4a72-9d68-c98953a3a8a7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.744 187082 DEBUG oslo_concurrency.lockutils [req-da3101c5-50ee-4c91-8c9a-38ce08bb841c req-e3de91f3-5096-4a72-9d68-c98953a3a8a7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.744 187082 DEBUG oslo_concurrency.lockutils [req-da3101c5-50ee-4c91-8c9a-38ce08bb841c req-e3de91f3-5096-4a72-9d68-c98953a3a8a7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.744 187082 DEBUG oslo_concurrency.lockutils [req-da3101c5-50ee-4c91-8c9a-38ce08bb841c req-e3de91f3-5096-4a72-9d68-c98953a3a8a7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.745 187082 DEBUG nova.compute.manager [req-da3101c5-50ee-4c91-8c9a-38ce08bb841c req-e3de91f3-5096-4a72-9d68-c98953a3a8a7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Processing event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.750 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a0740a0a-ac10-474c-965e-412b03ede2d4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395637, 'tstamp': 395637}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213113, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.754 187082 DEBUG oslo_concurrency.lockutils [req-308a7545-9652-46d5-9f45-bd67631872d1 req-bfb434d6-bbec-42ab-a68c-7a9ca0aa9c5d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.768 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f909aa69-ec65-46e3-b996-429ca0f8e376]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395637, 'reachable_time': 34678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213114, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.807 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[30a1efde-fc13-4ce1-82f7-c1704888d2f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.883 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[06efddd4-69d0-4dbd-b421-d421301b74e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.885 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.886 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.886 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.889 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:57 compute-1 NetworkManager[55527]: <info>  [1763990937.8904] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 24 13:28:57 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.894 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.895 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:57 compute-1 ovn_controller[95368]: 2025-11-24T13:28:57Z|00128|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.896 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.898 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1a21041f-a65e-4ede-b8d3-37605c0827c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.899 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:28:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:28:57.900 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.907 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.912 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.913 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990937.9115667, 2975848d-b193-4147-9775-3861b433ffd1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.913 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] VM Started (Lifecycle Event)
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.917 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.923 187082 INFO nova.virt.libvirt.driver [-] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Instance spawned successfully.
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.924 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.929 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.936 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.941 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.941 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.941 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.942 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.942 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.943 187082 DEBUG nova.virt.libvirt.driver [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.949 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.949 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990937.9118714, 2975848d-b193-4147-9775-3861b433ffd1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.949 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] VM Paused (Lifecycle Event)
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.968 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.972 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763990937.9185474, 2975848d-b193-4147-9775-3861b433ffd1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.972 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] VM Resumed (Lifecycle Event)
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.990 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.995 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.998 187082 INFO nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Took 6.22 seconds to spawn the instance on the hypervisor.
Nov 24 13:28:57 compute-1 nova_compute[187078]: 2025-11-24 13:28:57.998 187082 DEBUG nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:28:58 compute-1 nova_compute[187078]: 2025-11-24 13:28:58.016 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:28:58 compute-1 nova_compute[187078]: 2025-11-24 13:28:58.020 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:28:58 compute-1 nova_compute[187078]: 2025-11-24 13:28:58.050 187082 INFO nova.compute.manager [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Took 7.07 seconds to build instance.
Nov 24 13:28:58 compute-1 nova_compute[187078]: 2025-11-24 13:28:58.064 187082 DEBUG oslo_concurrency.lockutils [None req-1318eaae-7af0-41e8-9d65-df742d96f762 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:58 compute-1 podman[213153]: 2025-11-24 13:28:58.272515938 +0000 UTC m=+0.025090300 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:28:58 compute-1 podman[213153]: 2025-11-24 13:28:58.663200387 +0000 UTC m=+0.415774719 container create c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:28:58 compute-1 systemd[1]: Started libpod-conmon-c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682.scope.
Nov 24 13:28:58 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:28:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29a0b0802e9dcd9e56e938e89673c5eb582a1bd0be5a36bbb83261099a5e5b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:28:58 compute-1 podman[213153]: 2025-11-24 13:28:58.934366484 +0000 UTC m=+0.686940806 container init c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 13:28:58 compute-1 podman[213153]: 2025-11-24 13:28:58.940510818 +0000 UTC m=+0.693085120 container start c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:28:58 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [NOTICE]   (213173) : New worker (213175) forked
Nov 24 13:28:58 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [NOTICE]   (213173) : Loading success.
Nov 24 13:28:59 compute-1 nova_compute[187078]: 2025-11-24 13:28:59.805 187082 DEBUG nova.compute.manager [req-a601d8f7-458a-4add-93a3-72dae437d8b9 req-4f2b8b77-5529-4393-928e-bfff474807ab 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:28:59 compute-1 nova_compute[187078]: 2025-11-24 13:28:59.805 187082 DEBUG oslo_concurrency.lockutils [req-a601d8f7-458a-4add-93a3-72dae437d8b9 req-4f2b8b77-5529-4393-928e-bfff474807ab 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:28:59 compute-1 nova_compute[187078]: 2025-11-24 13:28:59.805 187082 DEBUG oslo_concurrency.lockutils [req-a601d8f7-458a-4add-93a3-72dae437d8b9 req-4f2b8b77-5529-4393-928e-bfff474807ab 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:28:59 compute-1 nova_compute[187078]: 2025-11-24 13:28:59.806 187082 DEBUG oslo_concurrency.lockutils [req-a601d8f7-458a-4add-93a3-72dae437d8b9 req-4f2b8b77-5529-4393-928e-bfff474807ab 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:28:59 compute-1 nova_compute[187078]: 2025-11-24 13:28:59.806 187082 DEBUG nova.compute.manager [req-a601d8f7-458a-4add-93a3-72dae437d8b9 req-4f2b8b77-5529-4393-928e-bfff474807ab 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:28:59 compute-1 nova_compute[187078]: 2025-11-24 13:28:59.806 187082 WARNING nova.compute.manager [req-a601d8f7-458a-4add-93a3-72dae437d8b9 req-4f2b8b77-5529-4393-928e-bfff474807ab 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received unexpected event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with vm_state active and task_state None.
Nov 24 13:29:01 compute-1 nova_compute[187078]: 2025-11-24 13:29:01.429 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:03 compute-1 nova_compute[187078]: 2025-11-24 13:29:03.018 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:03 compute-1 sshd-session[213184]: Invalid user appuser from 68.183.82.237 port 48106
Nov 24 13:29:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:29:04.158 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:29:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:29:04.160 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:29:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:29:04.161 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:29:04 compute-1 sshd-session[213184]: Received disconnect from 68.183.82.237 port 48106:11: Bye Bye [preauth]
Nov 24 13:29:04 compute-1 sshd-session[213184]: Disconnected from invalid user appuser 68.183.82.237 port 48106 [preauth]
Nov 24 13:29:05 compute-1 nova_compute[187078]: 2025-11-24 13:29:05.301 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:29:05.302 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:29:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:29:05.304 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:29:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:29:05.305 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:29:05 compute-1 podman[197429]: time="2025-11-24T13:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:29:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:29:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 24 13:29:05 compute-1 sshd-session[213186]: Invalid user saas from 176.114.89.34 port 44822
Nov 24 13:29:06 compute-1 sshd-session[213186]: Received disconnect from 176.114.89.34 port 44822:11: Bye Bye [preauth]
Nov 24 13:29:06 compute-1 sshd-session[213186]: Disconnected from invalid user saas 176.114.89.34 port 44822 [preauth]
Nov 24 13:29:06 compute-1 nova_compute[187078]: 2025-11-24 13:29:06.436 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:06 compute-1 nova_compute[187078]: 2025-11-24 13:29:06.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:07 compute-1 podman[213188]: 2025-11-24 13:29:07.518573436 +0000 UTC m=+0.056930711 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:29:07 compute-1 podman[213189]: 2025-11-24 13:29:07.520079716 +0000 UTC m=+0.055647127 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:29:08 compute-1 nova_compute[187078]: 2025-11-24 13:29:08.021 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:10 compute-1 nova_compute[187078]: 2025-11-24 13:29:10.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:11 compute-1 nova_compute[187078]: 2025-11-24 13:29:11.440 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:12 compute-1 sshd-session[213232]: Invalid user sol from 45.148.10.240 port 38456
Nov 24 13:29:12 compute-1 sshd-session[213232]: Connection closed by invalid user sol 45.148.10.240 port 38456 [preauth]
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.694 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.694 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.780 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.875 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.877 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:29:12 compute-1 nova_compute[187078]: 2025-11-24 13:29:12.969 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.024 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.205 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.207 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5695MB free_disk=73.43220138549805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.208 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.208 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.303 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 2975848d-b193-4147-9775-3861b433ffd1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.304 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.305 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.329 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.351 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.351 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.372 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.402 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.447 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.464 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.490 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:29:13 compute-1 nova_compute[187078]: 2025-11-24 13:29:13.491 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:29:14 compute-1 ovn_controller[95368]: 2025-11-24T13:29:14Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:50:20 10.100.0.12
Nov 24 13:29:14 compute-1 ovn_controller[95368]: 2025-11-24T13:29:14Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:50:20 10.100.0.12
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.491 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.805 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.805 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.806 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:29:14 compute-1 nova_compute[187078]: 2025-11-24 13:29:14.806 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2975848d-b193-4147-9775-3861b433ffd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:29:15 compute-1 podman[213263]: 2025-11-24 13:29:15.563896804 +0000 UTC m=+0.084280420 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 24 13:29:15 compute-1 podman[213264]: 2025-11-24 13:29:15.566458122 +0000 UTC m=+0.092133720 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 13:29:15 compute-1 nova_compute[187078]: 2025-11-24 13:29:15.648 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updating instance_info_cache with network_info: [{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:29:15 compute-1 nova_compute[187078]: 2025-11-24 13:29:15.663 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:29:15 compute-1 nova_compute[187078]: 2025-11-24 13:29:15.663 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:29:15 compute-1 nova_compute[187078]: 2025-11-24 13:29:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:15 compute-1 nova_compute[187078]: 2025-11-24 13:29:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:15 compute-1 nova_compute[187078]: 2025-11-24 13:29:15.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:29:16 compute-1 nova_compute[187078]: 2025-11-24 13:29:16.446 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:16 compute-1 nova_compute[187078]: 2025-11-24 13:29:16.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:29:18 compute-1 nova_compute[187078]: 2025-11-24 13:29:18.027 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: ERROR   13:29:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: ERROR   13:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: ERROR   13:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: ERROR   13:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: ERROR   13:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:29:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:29:21 compute-1 nova_compute[187078]: 2025-11-24 13:29:21.448 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:23 compute-1 nova_compute[187078]: 2025-11-24 13:29:23.030 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:23 compute-1 podman[213310]: 2025-11-24 13:29:23.54886925 +0000 UTC m=+0.083971603 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 24 13:29:25 compute-1 sshd-session[213308]: Invalid user server from 45.78.194.40 port 41654
Nov 24 13:29:25 compute-1 sshd-session[213308]: Received disconnect from 45.78.194.40 port 41654:11: Bye Bye [preauth]
Nov 24 13:29:25 compute-1 sshd-session[213308]: Disconnected from invalid user server 45.78.194.40 port 41654 [preauth]
Nov 24 13:29:26 compute-1 nova_compute[187078]: 2025-11-24 13:29:26.453 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:28 compute-1 nova_compute[187078]: 2025-11-24 13:29:28.074 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:31 compute-1 nova_compute[187078]: 2025-11-24 13:29:31.458 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:33 compute-1 nova_compute[187078]: 2025-11-24 13:29:33.075 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:33 compute-1 sshd-session[213331]: Invalid user alma from 175.100.24.139 port 32782
Nov 24 13:29:33 compute-1 sshd-session[213331]: Received disconnect from 175.100.24.139 port 32782:11: Bye Bye [preauth]
Nov 24 13:29:33 compute-1 sshd-session[213331]: Disconnected from invalid user alma 175.100.24.139 port 32782 [preauth]
Nov 24 13:29:35 compute-1 podman[197429]: time="2025-11-24T13:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:29:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:29:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 24 13:29:36 compute-1 ovn_controller[95368]: 2025-11-24T13:29:36Z|00129|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Nov 24 13:29:36 compute-1 nova_compute[187078]: 2025-11-24 13:29:36.462 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:37 compute-1 sshd-session[213334]: Received disconnect from 85.209.134.43 port 55724:11: Bye Bye [preauth]
Nov 24 13:29:37 compute-1 sshd-session[213334]: Disconnected from authenticating user root 85.209.134.43 port 55724 [preauth]
Nov 24 13:29:38 compute-1 nova_compute[187078]: 2025-11-24 13:29:38.077 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:38 compute-1 podman[213337]: 2025-11-24 13:29:38.518671189 +0000 UTC m=+0.058949855 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 24 13:29:38 compute-1 podman[213336]: 2025-11-24 13:29:38.534986034 +0000 UTC m=+0.081076935 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:29:41 compute-1 nova_compute[187078]: 2025-11-24 13:29:41.466 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:43 compute-1 nova_compute[187078]: 2025-11-24 13:29:43.079 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:46 compute-1 nova_compute[187078]: 2025-11-24 13:29:46.469 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:46 compute-1 podman[213379]: 2025-11-24 13:29:46.558946082 +0000 UTC m=+0.086747356 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 24 13:29:46 compute-1 podman[213380]: 2025-11-24 13:29:46.594332926 +0000 UTC m=+0.116164691 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:29:48 compute-1 nova_compute[187078]: 2025-11-24 13:29:48.081 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: ERROR   13:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: ERROR   13:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: ERROR   13:29:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: ERROR   13:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: ERROR   13:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:29:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:29:50 compute-1 sshd-session[213422]: Invalid user liyang from 5.198.176.28 port 44552
Nov 24 13:29:50 compute-1 sshd-session[213422]: Received disconnect from 5.198.176.28 port 44552:11: Bye Bye [preauth]
Nov 24 13:29:50 compute-1 sshd-session[213422]: Disconnected from invalid user liyang 5.198.176.28 port 44552 [preauth]
Nov 24 13:29:51 compute-1 nova_compute[187078]: 2025-11-24 13:29:51.250 187082 DEBUG nova.compute.manager [None req-ade35015-a558-4378-b270-976c66991f83 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 24 13:29:51 compute-1 nova_compute[187078]: 2025-11-24 13:29:51.294 187082 DEBUG nova.compute.provider_tree [None req-ade35015-a558-4378-b270-976c66991f83 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 25 to 26 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:29:51 compute-1 nova_compute[187078]: 2025-11-24 13:29:51.473 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:53 compute-1 nova_compute[187078]: 2025-11-24 13:29:53.084 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:54 compute-1 podman[213424]: 2025-11-24 13:29:54.546122028 +0000 UTC m=+0.076473806 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter)
Nov 24 13:29:54 compute-1 nova_compute[187078]: 2025-11-24 13:29:54.905 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Check if temp file /var/lib/nova/instances/tmply1zg9so exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:29:54 compute-1 nova_compute[187078]: 2025-11-24 13:29:54.906 187082 DEBUG nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmply1zg9so',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2975848d-b193-4147-9775-3861b433ffd1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:29:56 compute-1 nova_compute[187078]: 2025-11-24 13:29:56.477 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:29:57 compute-1 nova_compute[187078]: 2025-11-24 13:29:57.422 187082 DEBUG oslo_concurrency.processutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:29:57 compute-1 nova_compute[187078]: 2025-11-24 13:29:57.517 187082 DEBUG oslo_concurrency.processutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:29:57 compute-1 nova_compute[187078]: 2025-11-24 13:29:57.519 187082 DEBUG oslo_concurrency.processutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:29:57 compute-1 nova_compute[187078]: 2025-11-24 13:29:57.578 187082 DEBUG oslo_concurrency.processutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:29:58 compute-1 nova_compute[187078]: 2025-11-24 13:29:58.088 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:00 compute-1 sshd-session[213451]: Accepted publickey for nova from 192.168.122.100 port 44718 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:30:00 compute-1 systemd-logind[815]: New session 39 of user nova.
Nov 24 13:30:00 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:30:00 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:30:00 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:30:00 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:30:00 compute-1 systemd[213455]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:30:00 compute-1 systemd[213455]: Queued start job for default target Main User Target.
Nov 24 13:30:00 compute-1 systemd[213455]: Created slice User Application Slice.
Nov 24 13:30:00 compute-1 systemd[213455]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:30:00 compute-1 systemd[213455]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:30:00 compute-1 systemd[213455]: Reached target Paths.
Nov 24 13:30:00 compute-1 systemd[213455]: Reached target Timers.
Nov 24 13:30:00 compute-1 systemd[213455]: Starting D-Bus User Message Bus Socket...
Nov 24 13:30:00 compute-1 systemd[213455]: Starting Create User's Volatile Files and Directories...
Nov 24 13:30:00 compute-1 systemd[213455]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:30:00 compute-1 systemd[213455]: Reached target Sockets.
Nov 24 13:30:00 compute-1 systemd[213455]: Finished Create User's Volatile Files and Directories.
Nov 24 13:30:00 compute-1 systemd[213455]: Reached target Basic System.
Nov 24 13:30:00 compute-1 systemd[213455]: Reached target Main User Target.
Nov 24 13:30:00 compute-1 systemd[213455]: Startup finished in 159ms.
Nov 24 13:30:00 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:30:00 compute-1 systemd[1]: Started Session 39 of User nova.
Nov 24 13:30:00 compute-1 sshd-session[213451]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:30:00 compute-1 sshd-session[213470]: Received disconnect from 192.168.122.100 port 44718:11: disconnected by user
Nov 24 13:30:00 compute-1 sshd-session[213470]: Disconnected from user nova 192.168.122.100 port 44718
Nov 24 13:30:00 compute-1 sshd-session[213451]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:30:00 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Nov 24 13:30:00 compute-1 systemd-logind[815]: Session 39 logged out. Waiting for processes to exit.
Nov 24 13:30:00 compute-1 systemd-logind[815]: Removed session 39.
Nov 24 13:30:01 compute-1 anacron[94740]: Job `cron.daily' started
Nov 24 13:30:01 compute-1 anacron[94740]: Job `cron.daily' terminated
Nov 24 13:30:01 compute-1 nova_compute[187078]: 2025-11-24 13:30:01.482 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:02 compute-1 nova_compute[187078]: 2025-11-24 13:30:02.462 187082 DEBUG nova.compute.manager [req-a347b474-9dce-4f7f-ae77-38dd8f7da000 req-8715c6c7-a5de-4ede-8b2c-89695779bc52 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:02 compute-1 nova_compute[187078]: 2025-11-24 13:30:02.463 187082 DEBUG oslo_concurrency.lockutils [req-a347b474-9dce-4f7f-ae77-38dd8f7da000 req-8715c6c7-a5de-4ede-8b2c-89695779bc52 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:02 compute-1 nova_compute[187078]: 2025-11-24 13:30:02.463 187082 DEBUG oslo_concurrency.lockutils [req-a347b474-9dce-4f7f-ae77-38dd8f7da000 req-8715c6c7-a5de-4ede-8b2c-89695779bc52 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:02 compute-1 nova_compute[187078]: 2025-11-24 13:30:02.464 187082 DEBUG oslo_concurrency.lockutils [req-a347b474-9dce-4f7f-ae77-38dd8f7da000 req-8715c6c7-a5de-4ede-8b2c-89695779bc52 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:02 compute-1 nova_compute[187078]: 2025-11-24 13:30:02.464 187082 DEBUG nova.compute.manager [req-a347b474-9dce-4f7f-ae77-38dd8f7da000 req-8715c6c7-a5de-4ede-8b2c-89695779bc52 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:02 compute-1 nova_compute[187078]: 2025-11-24 13:30:02.464 187082 DEBUG nova.compute.manager [req-a347b474-9dce-4f7f-ae77-38dd8f7da000 req-8715c6c7-a5de-4ede-8b2c-89695779bc52 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:30:03 compute-1 nova_compute[187078]: 2025-11-24 13:30:03.091 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:04.157 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:04.158 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:04.159 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:04 compute-1 nova_compute[187078]: 2025-11-24 13:30:04.641 187082 DEBUG nova.compute.manager [req-7bff6e15-a835-4626-adcd-67f0cefccf2c req-fb549a19-4e95-491c-a4bc-dfce70924942 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:04 compute-1 nova_compute[187078]: 2025-11-24 13:30:04.641 187082 DEBUG oslo_concurrency.lockutils [req-7bff6e15-a835-4626-adcd-67f0cefccf2c req-fb549a19-4e95-491c-a4bc-dfce70924942 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:04 compute-1 nova_compute[187078]: 2025-11-24 13:30:04.641 187082 DEBUG oslo_concurrency.lockutils [req-7bff6e15-a835-4626-adcd-67f0cefccf2c req-fb549a19-4e95-491c-a4bc-dfce70924942 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:04 compute-1 nova_compute[187078]: 2025-11-24 13:30:04.642 187082 DEBUG oslo_concurrency.lockutils [req-7bff6e15-a835-4626-adcd-67f0cefccf2c req-fb549a19-4e95-491c-a4bc-dfce70924942 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:04 compute-1 nova_compute[187078]: 2025-11-24 13:30:04.642 187082 DEBUG nova.compute.manager [req-7bff6e15-a835-4626-adcd-67f0cefccf2c req-fb549a19-4e95-491c-a4bc-dfce70924942 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:04 compute-1 nova_compute[187078]: 2025-11-24 13:30:04.642 187082 WARNING nova.compute.manager [req-7bff6e15-a835-4626-adcd-67f0cefccf2c req-fb549a19-4e95-491c-a4bc-dfce70924942 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received unexpected event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with vm_state active and task_state migrating.
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.103 187082 INFO nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Took 7.52 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.104 187082 DEBUG nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.119 187082 DEBUG nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmply1zg9so',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2975848d-b193-4147-9775-3861b433ffd1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1b5db853-85e9-4510-9127-18de5b5db0a8),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.148 187082 DEBUG nova.objects.instance [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid 2975848d-b193-4147-9775-3861b433ffd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.149 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.152 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.153 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.165 187082 DEBUG nova.virt.libvirt.vif [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-376350173',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-376350173',id=13,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:28:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-4dofsm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:28:58Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=2975848d-b193-4147-9775-3861b433ffd1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.166 187082 DEBUG nova.network.os_vif_util [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.166 187082 DEBUG nova.network.os_vif_util [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.167 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:30:05 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:37:50:20"/>
Nov 24 13:30:05 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:30:05 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:30:05 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:30:05 compute-1 nova_compute[187078]:   <target dev="tapb67799d4-05"/>
Nov 24 13:30:05 compute-1 nova_compute[187078]: </interface>
Nov 24 13:30:05 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.168 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:30:05 compute-1 podman[197429]: time="2025-11-24T13:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:30:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.655 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.656 187082 INFO nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:30:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Nov 24 13:30:05 compute-1 nova_compute[187078]: 2025-11-24 13:30:05.708 187082 INFO nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.211 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.211 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.714 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.715 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.755 187082 DEBUG nova.compute.manager [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-changed-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.755 187082 DEBUG nova.compute.manager [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Refreshing instance network info cache due to event network-changed-b67799d4-0548-47ba-a2a4-2f8dd8402dd6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.756 187082 DEBUG oslo_concurrency.lockutils [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.756 187082 DEBUG oslo_concurrency.lockutils [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:30:06 compute-1 nova_compute[187078]: 2025-11-24 13:30:06.756 187082 DEBUG nova.network.neutron [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Refreshing network info cache for port b67799d4-0548-47ba-a2a4-2f8dd8402dd6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.277 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.278 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.782 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.782 187082 DEBUG nova.virt.libvirt.migration [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.832 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991007.8312438, 2975848d-b193-4147-9775-3861b433ffd1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.832 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] VM Paused (Lifecycle Event)
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.857 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.862 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:30:07 compute-1 nova_compute[187078]: 2025-11-24 13:30:07.877 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:30:07 compute-1 kernel: tapb67799d4-05 (unregistering): left promiscuous mode
Nov 24 13:30:08 compute-1 NetworkManager[55527]: <info>  [1763991008.0192] device (tapb67799d4-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:30:08 compute-1 ovn_controller[95368]: 2025-11-24T13:30:08Z|00130|binding|INFO|Releasing lport b67799d4-0548-47ba-a2a4-2f8dd8402dd6 from this chassis (sb_readonly=0)
Nov 24 13:30:08 compute-1 ovn_controller[95368]: 2025-11-24T13:30:08Z|00131|binding|INFO|Setting lport b67799d4-0548-47ba-a2a4-2f8dd8402dd6 down in Southbound
Nov 24 13:30:08 compute-1 ovn_controller[95368]: 2025-11-24T13:30:08Z|00132|binding|INFO|Removing iface tapb67799d4-05 ovn-installed in OVS
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.029 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.043 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.046 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:50:20 10.100.0.12'], port_security=['fa:16:3e:37:50:20 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2975848d-b193-4147-9775-3861b433ffd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=b67799d4-0548-47ba-a2a4-2f8dd8402dd6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.047 104225 INFO neutron.agent.ovn.metadata.agent [-] Port b67799d4-0548-47ba-a2a4-2f8dd8402dd6 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.048 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.051 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0c9853-783a-4212-b038-2a92365f3a27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.053 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:30:08 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 24 13:30:08 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 18.179s CPU time.
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.093 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 systemd-machined[153355]: Machine qemu-10-instance-0000000d terminated.
Nov 24 13:30:08 compute-1 NetworkManager[55527]: <info>  [1763991008.1745] manager: (tapb67799d4-05): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.176 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.181 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [NOTICE]   (213173) : haproxy version is 2.8.14-c23fe91
Nov 24 13:30:08 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [NOTICE]   (213173) : path to executable is /usr/sbin/haproxy
Nov 24 13:30:08 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [WARNING]  (213173) : Exiting Master process...
Nov 24 13:30:08 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [ALERT]    (213173) : Current worker (213175) exited with code 143 (Terminated)
Nov 24 13:30:08 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213169]: [WARNING]  (213173) : All workers exited. Exiting... (0)
Nov 24 13:30:08 compute-1 systemd[1]: libpod-c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682.scope: Deactivated successfully.
Nov 24 13:30:08 compute-1 podman[213512]: 2025-11-24 13:30:08.201181527 +0000 UTC m=+0.051237604 container died c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.221 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.222 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.222 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:30:08 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682-userdata-shm.mount: Deactivated successfully.
Nov 24 13:30:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-7c29a0b0802e9dcd9e56e938e89673c5eb582a1bd0be5a36bbb83261099a5e5b-merged.mount: Deactivated successfully.
Nov 24 13:30:08 compute-1 podman[213512]: 2025-11-24 13:30:08.240782816 +0000 UTC m=+0.090838893 container cleanup c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 24 13:30:08 compute-1 systemd[1]: libpod-conmon-c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682.scope: Deactivated successfully.
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.285 187082 DEBUG nova.virt.libvirt.guest [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '2975848d-b193-4147-9775-3861b433ffd1' (instance-0000000d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.285 187082 INFO nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migration operation has completed
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.286 187082 INFO nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] _post_live_migration() is started..
Nov 24 13:30:08 compute-1 podman[213557]: 2025-11-24 13:30:08.302354791 +0000 UTC m=+0.040347390 container remove c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.310 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[fa23cbc6-c885-45e6-9b0e-729aef6602e6]: (4, ('Mon Nov 24 01:30:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682)\nc5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682\nMon Nov 24 01:30:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (c5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682)\nc5f19039af4bc74239dfef111b7eff9382fc002c611af5e9ddda728770be4682\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.312 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbd354e-3002-4711-a7d4-2930f422b59d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.314 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:08 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.316 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.319 187082 DEBUG nova.network.neutron [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updated VIF entry in instance network info cache for port b67799d4-0548-47ba-a2a4-2f8dd8402dd6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.319 187082 DEBUG nova.network.neutron [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updating instance_info_cache with network_info: [{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.331 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.334 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8f830c-1746-47c4-8cfa-5d76a3153832]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.339 187082 DEBUG oslo_concurrency.lockutils [req-ee2892f4-fd69-4f66-903b-137853c08fa6 req-26025f0d-731c-4b0e-b234-743abd9bfa9f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-2975848d-b193-4147-9775-3861b433ffd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.349 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[7c610148-30c4-43a3-b9ef-f8e939b39f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.350 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9d89dbcc-b51f-49f8-9842-4ead607c338a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.365 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc72fe7-3d50-4d39-a68f-4c1f33767699]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395629, 'reachable_time': 20519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213575, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.369 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.369 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[a14d547a-7b21-485e-b0ac-de13a7f9cea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:08 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.684 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.763 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.764 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:08.764 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.859 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.859 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.859 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.859 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.860 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.860 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.860 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.860 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.860 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.860 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.861 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.861 187082 WARNING nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received unexpected event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with vm_state active and task_state migrating.
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.861 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.861 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.861 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.862 187082 DEBUG oslo_concurrency.lockutils [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.862 187082 DEBUG nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:08 compute-1 nova_compute[187078]: 2025-11-24 13:30:08.862 187082 WARNING nova.compute.manager [req-b0599dd5-a0cb-4e8b-8222-5abdccd5fa5c req-4b38802d-d0a7-414d-a186-f028c415944e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received unexpected event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with vm_state active and task_state migrating.
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.010 187082 DEBUG nova.network.neutron [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port b67799d4-0548-47ba-a2a4-2f8dd8402dd6 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.011 187082 DEBUG nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.011 187082 DEBUG nova.virt.libvirt.vif [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:28:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-376350173',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-376350173',id=13,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:28:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-4dofsm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:29:53Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=2975848d-b193-4147-9775-3861b433ffd1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.012 187082 DEBUG nova.network.os_vif_util [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "address": "fa:16:3e:37:50:20", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb67799d4-05", "ovs_interfaceid": "b67799d4-0548-47ba-a2a4-2f8dd8402dd6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.012 187082 DEBUG nova.network.os_vif_util [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.012 187082 DEBUG os_vif [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.014 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.014 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb67799d4-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.015 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.016 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.022 187082 INFO os_vif [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:50:20,bridge_name='br-int',has_traffic_filtering=True,id=b67799d4-0548-47ba-a2a4-2f8dd8402dd6,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb67799d4-05')
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.023 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.023 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.023 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.023 187082 DEBUG nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.023 187082 INFO nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Deleting instance files /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1_del
Nov 24 13:30:09 compute-1 nova_compute[187078]: 2025-11-24 13:30:09.024 187082 INFO nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Deletion of /var/lib/nova/instances/2975848d-b193-4147-9775-3861b433ffd1_del complete
Nov 24 13:30:09 compute-1 podman[213576]: 2025-11-24 13:30:09.50973634 +0000 UTC m=+0.054049459 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:30:09 compute-1 podman[213577]: 2025-11-24 13:30:09.540352731 +0000 UTC m=+0.078008625 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:10 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:30:10 compute-1 systemd[213455]: Activating special unit Exit the Session...
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped target Main User Target.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped target Basic System.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped target Paths.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped target Sockets.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped target Timers.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:30:10 compute-1 systemd[213455]: Closed D-Bus User Message Bus Socket.
Nov 24 13:30:10 compute-1 systemd[213455]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:30:10 compute-1 systemd[213455]: Removed slice User Application Slice.
Nov 24 13:30:10 compute-1 systemd[213455]: Reached target Shutdown.
Nov 24 13:30:10 compute-1 systemd[213455]: Finished Exit the Session.
Nov 24 13:30:10 compute-1 systemd[213455]: Reached target Exit the Session.
Nov 24 13:30:10 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:30:10 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:30:10 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.964 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.965 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.965 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.965 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.965 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.966 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-unplugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.966 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.966 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.966 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.967 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.967 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.967 187082 WARNING nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received unexpected event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with vm_state active and task_state migrating.
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.967 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.967 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.968 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.968 187082 DEBUG oslo_concurrency.lockutils [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.968 187082 DEBUG nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] No waiting events found dispatching network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:10 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:30:10 compute-1 nova_compute[187078]: 2025-11-24 13:30:10.968 187082 WARNING nova.compute.manager [req-7a83e6c8-6479-409b-9f0b-e5d232f959ec req-c111658b-3006-4911-b194-7569b02764b3 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Received unexpected event network-vif-plugged-b67799d4-0548-47ba-a2a4-2f8dd8402dd6 for instance with vm_state active and task_state migrating.
Nov 24 13:30:10 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:30:10 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:30:10 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.687 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.688 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.870 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.871 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5881MB free_disk=73.45995712280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.871 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.871 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:12 compute-1 nova_compute[187078]: 2025-11-24 13:30:12.916 187082 INFO nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Updating resource usage from migration 1b5db853-85e9-4510-9127-18de5b5db0a8
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.034 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Migration 1b5db853-85e9-4510-9127-18de5b5db0a8 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.034 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.035 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.096 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.148 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.160 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.229 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.229 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.632 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "2975848d-b193-4147-9775-3861b433ffd1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.633 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.634 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "2975848d-b193-4147-9775-3861b433ffd1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.657 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.658 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.658 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.659 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:30:13 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:13.767 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.823 187082 WARNING nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.825 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5882MB free_disk=73.45997619628906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.825 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.825 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.858 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance 2975848d-b193-4147-9775-3861b433ffd1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.872 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.890 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration 1b5db853-85e9-4510-9127-18de5b5db0a8 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.890 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.891 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.932 187082 DEBUG nova.compute.provider_tree [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:30:13 compute-1 nova_compute[187078]: 2025-11-24 13:30:13.946 187082 DEBUG nova.scheduler.client.report [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.011 187082 DEBUG nova.compute.resource_tracker [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.012 187082 DEBUG oslo_concurrency.lockutils [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.016 187082 INFO nova.compute.manager [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.017 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.157 187082 INFO nova.scheduler.client.report [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration 1b5db853-85e9-4510-9127-18de5b5db0a8
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.157 187082 DEBUG nova.virt.libvirt.driver [None req-5c792d60-7c94-4c95-ba7d-a497bc256cf7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:30:14 compute-1 sshd-session[213622]: Invalid user vpnuser from 176.114.89.34 port 38932
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.989 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.990 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:30:14 compute-1 nova_compute[187078]: 2025-11-24 13:30:14.990 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:30:15 compute-1 nova_compute[187078]: 2025-11-24 13:30:15.004 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:30:15 compute-1 nova_compute[187078]: 2025-11-24 13:30:15.005 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:15 compute-1 nova_compute[187078]: 2025-11-24 13:30:15.005 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:15 compute-1 sshd-session[213622]: Received disconnect from 176.114.89.34 port 38932:11: Bye Bye [preauth]
Nov 24 13:30:15 compute-1 sshd-session[213622]: Disconnected from invalid user vpnuser 176.114.89.34 port 38932 [preauth]
Nov 24 13:30:15 compute-1 nova_compute[187078]: 2025-11-24 13:30:15.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:17 compute-1 podman[213624]: 2025-11-24 13:30:17.532176456 +0000 UTC m=+0.063614549 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 13:30:17 compute-1 podman[213625]: 2025-11-24 13:30:17.566922427 +0000 UTC m=+0.083542092 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 24 13:30:17 compute-1 nova_compute[187078]: 2025-11-24 13:30:17.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:17 compute-1 nova_compute[187078]: 2025-11-24 13:30:17.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:17 compute-1 nova_compute[187078]: 2025-11-24 13:30:17.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:30:18 compute-1 nova_compute[187078]: 2025-11-24 13:30:18.118 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:19 compute-1 nova_compute[187078]: 2025-11-24 13:30:19.019 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: ERROR   13:30:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: ERROR   13:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: ERROR   13:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: ERROR   13:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: ERROR   13:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:30:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:30:19 compute-1 nova_compute[187078]: 2025-11-24 13:30:19.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:22 compute-1 sshd-session[213669]: Invalid user postgres from 68.183.82.237 port 55116
Nov 24 13:30:23 compute-1 nova_compute[187078]: 2025-11-24 13:30:23.120 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:23 compute-1 sshd-session[213669]: Received disconnect from 68.183.82.237 port 55116:11: Bye Bye [preauth]
Nov 24 13:30:23 compute-1 sshd-session[213669]: Disconnected from invalid user postgres 68.183.82.237 port 55116 [preauth]
Nov 24 13:30:23 compute-1 nova_compute[187078]: 2025-11-24 13:30:23.219 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991008.2187285, 2975848d-b193-4147-9775-3861b433ffd1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:30:23 compute-1 nova_compute[187078]: 2025-11-24 13:30:23.219 187082 INFO nova.compute.manager [-] [instance: 2975848d-b193-4147-9775-3861b433ffd1] VM Stopped (Lifecycle Event)
Nov 24 13:30:23 compute-1 nova_compute[187078]: 2025-11-24 13:30:23.241 187082 DEBUG nova.compute.manager [None req-99cb5538-593c-45b5-93ee-9bb8c49e6cd4 - - - - - -] [instance: 2975848d-b193-4147-9775-3861b433ffd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:30:24 compute-1 nova_compute[187078]: 2025-11-24 13:30:24.021 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:25 compute-1 podman[213673]: 2025-11-24 13:30:25.532784201 +0000 UTC m=+0.076044465 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6)
Nov 24 13:30:25 compute-1 sshd-session[213671]: Invalid user brain from 193.32.162.145 port 40174
Nov 24 13:30:26 compute-1 sshd-session[213671]: Connection closed by invalid user brain 193.32.162.145 port 40174 [preauth]
Nov 24 13:30:26 compute-1 nova_compute[187078]: 2025-11-24 13:30:26.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:26 compute-1 nova_compute[187078]: 2025-11-24 13:30:26.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:30:28 compute-1 nova_compute[187078]: 2025-11-24 13:30:28.123 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:29 compute-1 nova_compute[187078]: 2025-11-24 13:30:29.023 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:31 compute-1 nova_compute[187078]: 2025-11-24 13:30:31.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:30:31 compute-1 nova_compute[187078]: 2025-11-24 13:30:31.679 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:30:31 compute-1 nova_compute[187078]: 2025-11-24 13:30:31.695 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:30:33 compute-1 nova_compute[187078]: 2025-11-24 13:30:33.179 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:34 compute-1 nova_compute[187078]: 2025-11-24 13:30:34.025 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:35 compute-1 podman[197429]: time="2025-11-24T13:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:30:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:30:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 24 13:30:36 compute-1 sshd-session[213695]: Invalid user rahul from 85.209.134.43 port 60932
Nov 24 13:30:36 compute-1 sshd-session[213695]: Received disconnect from 85.209.134.43 port 60932:11: Bye Bye [preauth]
Nov 24 13:30:36 compute-1 sshd-session[213695]: Disconnected from invalid user rahul 85.209.134.43 port 60932 [preauth]
Nov 24 13:30:37 compute-1 nova_compute[187078]: 2025-11-24 13:30:37.232 187082 DEBUG nova.compute.manager [None req-39bb8696-5e47-43fb-bd5b-375b99889779 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 24 13:30:37 compute-1 nova_compute[187078]: 2025-11-24 13:30:37.276 187082 DEBUG nova.compute.provider_tree [None req-39bb8696-5e47-43fb-bd5b-375b99889779 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 26 to 29 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:30:38 compute-1 nova_compute[187078]: 2025-11-24 13:30:38.180 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:39 compute-1 nova_compute[187078]: 2025-11-24 13:30:39.027 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:40 compute-1 podman[213697]: 2025-11-24 13:30:40.495025994 +0000 UTC m=+0.046410248 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:30:40 compute-1 podman[213698]: 2025-11-24 13:30:40.528762118 +0000 UTC m=+0.077253436 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true)
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.839 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.839 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.866 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.953 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.954 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.961 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:30:40 compute-1 nova_compute[187078]: 2025-11-24 13:30:40.962 187082 INFO nova.compute.claims [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.099 187082 DEBUG nova.compute.provider_tree [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.120 187082 DEBUG nova.scheduler.client.report [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.155 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.155 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.224 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.224 187082 DEBUG nova.network.neutron [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.261 187082 INFO nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.291 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.378 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.380 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.380 187082 INFO nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Creating image(s)
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.381 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.381 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.382 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.394 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.452 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.453 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.454 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.464 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.519 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.520 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.556 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.557 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.557 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.612 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.614 187082 DEBUG nova.virt.disk.api [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.614 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.670 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.671 187082 DEBUG nova.virt.disk.api [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.671 187082 DEBUG nova.objects.instance [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid 03dff342-d941-4d7e-9ada-afc46435fd14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.682 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.683 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Ensure instance console log exists: /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.683 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.683 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:41 compute-1 nova_compute[187078]: 2025-11-24 13:30:41.684 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:42 compute-1 nova_compute[187078]: 2025-11-24 13:30:42.375 187082 DEBUG nova.policy [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:30:43 compute-1 nova_compute[187078]: 2025-11-24 13:30:43.182 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:43 compute-1 nova_compute[187078]: 2025-11-24 13:30:43.727 187082 DEBUG nova.network.neutron [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Successfully created port: 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.029 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.340 187082 DEBUG nova.network.neutron [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Successfully updated port: 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.356 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.356 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.356 187082 DEBUG nova.network.neutron [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.425 187082 DEBUG nova.compute.manager [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-changed-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.426 187082 DEBUG nova.compute.manager [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Refreshing instance network info cache due to event network-changed-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.426 187082 DEBUG oslo_concurrency.lockutils [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:30:44 compute-1 nova_compute[187078]: 2025-11-24 13:30:44.484 187082 DEBUG nova.network.neutron [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.354 187082 DEBUG nova.network.neutron [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Updating instance_info_cache with network_info: [{"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.375 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.376 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Instance network_info: |[{"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.377 187082 DEBUG oslo_concurrency.lockutils [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.378 187082 DEBUG nova.network.neutron [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Refreshing network info cache for port 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.387 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Start _get_guest_xml network_info=[{"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.397 187082 WARNING nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.403 187082 DEBUG nova.virt.libvirt.host [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.404 187082 DEBUG nova.virt.libvirt.host [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.406 187082 DEBUG nova.virt.libvirt.host [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.407 187082 DEBUG nova.virt.libvirt.host [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.408 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.408 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.408 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.408 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.409 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.409 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.409 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.409 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.409 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.410 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.410 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.410 187082 DEBUG nova.virt.hardware [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.413 187082 DEBUG nova.virt.libvirt.vif [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:30:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-430944557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-430944557',id=15,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-f51vxjev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_sta
te='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:30:41Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=03dff342-d941-4d7e-9ada-afc46435fd14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.413 187082 DEBUG nova.network.os_vif_util [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.414 187082 DEBUG nova.network.os_vif_util [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.415 187082 DEBUG nova.objects.instance [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid 03dff342-d941-4d7e-9ada-afc46435fd14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.427 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <uuid>03dff342-d941-4d7e-9ada-afc46435fd14</uuid>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <name>instance-0000000f</name>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-430944557</nova:name>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:30:45</nova:creationTime>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         <nova:port uuid="37dfce22-04c0-4d3b-b2ab-31c6e5ce6078">
Nov 24 13:30:45 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <system>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <entry name="serial">03dff342-d941-4d7e-9ada-afc46435fd14</entry>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <entry name="uuid">03dff342-d941-4d7e-9ada-afc46435fd14</entry>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </system>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <os>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </os>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <features>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </features>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.config"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:b8:69:d7"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <target dev="tap37dfce22-04"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/console.log" append="off"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <video>
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </video>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:30:45 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:30:45 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:30:45 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:30:45 compute-1 nova_compute[187078]: </domain>
Nov 24 13:30:45 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.428 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Preparing to wait for external event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.428 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.429 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.429 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.430 187082 DEBUG nova.virt.libvirt.vif [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:30:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-430944557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-430944557',id=15,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-f51vxjev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:30:41Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=03dff342-d941-4d7e-9ada-afc46435fd14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.430 187082 DEBUG nova.network.os_vif_util [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.431 187082 DEBUG nova.network.os_vif_util [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.431 187082 DEBUG os_vif [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.432 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.432 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.432 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.435 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.435 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37dfce22-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.435 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37dfce22-04, col_values=(('external_ids', {'iface-id': '37dfce22-04c0-4d3b-b2ab-31c6e5ce6078', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:69:d7', 'vm-uuid': '03dff342-d941-4d7e-9ada-afc46435fd14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.437 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:45 compute-1 NetworkManager[55527]: <info>  [1763991045.4378] manager: (tap37dfce22-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.439 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.444 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.445 187082 INFO os_vif [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04')
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.484 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.484 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.484 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:b8:69:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:30:45 compute-1 nova_compute[187078]: 2025-11-24 13:30:45.485 187082 INFO nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Using config drive
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.367 187082 INFO nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Creating config drive at /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.config
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.371 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_2zar9_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.495 187082 DEBUG oslo_concurrency.processutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_2zar9_" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:30:46 compute-1 kernel: tap37dfce22-04: entered promiscuous mode
Nov 24 13:30:46 compute-1 NetworkManager[55527]: <info>  [1763991046.5577] manager: (tap37dfce22-04): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Nov 24 13:30:46 compute-1 ovn_controller[95368]: 2025-11-24T13:30:46Z|00133|binding|INFO|Claiming lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 for this chassis.
Nov 24 13:30:46 compute-1 ovn_controller[95368]: 2025-11-24T13:30:46Z|00134|binding|INFO|37dfce22-04c0-4d3b-b2ab-31c6e5ce6078: Claiming fa:16:3e:b8:69:d7 10.100.0.7
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.559 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.567 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:69:d7 10.100.0.7'], port_security=['fa:16:3e:b8:69:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '03dff342-d941-4d7e-9ada-afc46435fd14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.568 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.569 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:30:46 compute-1 ovn_controller[95368]: 2025-11-24T13:30:46Z|00135|binding|INFO|Setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 ovn-installed in OVS
Nov 24 13:30:46 compute-1 ovn_controller[95368]: 2025-11-24T13:30:46Z|00136|binding|INFO|Setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 up in Southbound
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.571 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.573 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.576 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.582 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[6d8efcac-f2bf-4df2-8075-20323dda3747]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.583 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.590 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.590 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[da0d589f-7df7-4b7c-ba95-00cb7379cc78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 systemd-udevd[213775]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.591 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[91e29fcd-ad2c-40c2-be9b-ccb1ddfefc3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 systemd-machined[153355]: New machine qemu-11-instance-0000000f.
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.602 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[5d05ec69-7e2c-4b2f-b84f-89987bc5a2fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 NetworkManager[55527]: <info>  [1763991046.6105] device (tap37dfce22-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:30:46 compute-1 NetworkManager[55527]: <info>  [1763991046.6126] device (tap37dfce22-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:30:46 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.617 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc7453a-10e4-479b-98c4-43f50382c55b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.648 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[ba5fed1a-152e-4e63-8853-ea539d7475f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.654 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[21249289-7b47-462f-ab42-6f137aa2d298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 NetworkManager[55527]: <info>  [1763991046.6563] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Nov 24 13:30:46 compute-1 systemd-udevd[213780]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.685 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d11c7732-eebc-47fb-b728-758a79f8cd6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.690 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[206641e0-75cb-48a2-a719-8c4fede29b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 NetworkManager[55527]: <info>  [1763991046.7129] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.720 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[02d64ee0-0971-4c2c-942c-f0e3e18c7f52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.738 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4c90adc0-1c22-4ae1-949a-57ea4b0baed5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406537, 'reachable_time': 41668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213808, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.750 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[698fb511-8d1a-4fd1-8589-3b428a811d05]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406537, 'tstamp': 406537}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213809, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.760 187082 DEBUG nova.compute.manager [req-9e714952-3cd0-4249-9557-5fdce1f02429 req-2a8a5782-7275-4c00-8bb6-6ed8a635c096 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.760 187082 DEBUG oslo_concurrency.lockutils [req-9e714952-3cd0-4249-9557-5fdce1f02429 req-2a8a5782-7275-4c00-8bb6-6ed8a635c096 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.761 187082 DEBUG oslo_concurrency.lockutils [req-9e714952-3cd0-4249-9557-5fdce1f02429 req-2a8a5782-7275-4c00-8bb6-6ed8a635c096 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.761 187082 DEBUG oslo_concurrency.lockutils [req-9e714952-3cd0-4249-9557-5fdce1f02429 req-2a8a5782-7275-4c00-8bb6-6ed8a635c096 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.761 187082 DEBUG nova.compute.manager [req-9e714952-3cd0-4249-9557-5fdce1f02429 req-2a8a5782-7275-4c00-8bb6-6ed8a635c096 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Processing event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.768 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9a178007-d06b-4e10-a9e4-a6aedc872540]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406537, 'reachable_time': 41668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213810, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.796 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[23611d0e-332f-4294-af08-7b3bcfbae2a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.851 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e022acf9-54f7-4f44-b0a4-14373436a562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.852 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.852 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.852 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.897 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 NetworkManager[55527]: <info>  [1763991046.9000] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 24 13:30:46 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.901 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.902 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:30:46 compute-1 ovn_controller[95368]: 2025-11-24T13:30:46Z|00137|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.903 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.920 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:30:46 compute-1 nova_compute[187078]: 2025-11-24 13:30:46.919 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.921 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[10aa916b-2667-4702-828a-6ce9578f95e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.922 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:30:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:30:46.923 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.088 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991047.088317, 03dff342-d941-4d7e-9ada-afc46435fd14 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.090 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] VM Started (Lifecycle Event)
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.092 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.097 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.100 187082 INFO nova.virt.libvirt.driver [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Instance spawned successfully.
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.100 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.116 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.121 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.125 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.125 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.126 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.127 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.127 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.128 187082 DEBUG nova.virt.libvirt.driver [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.136 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.136 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991047.0885203, 03dff342-d941-4d7e-9ada-afc46435fd14 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.137 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] VM Paused (Lifecycle Event)
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.154 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.157 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991047.0955646, 03dff342-d941-4d7e-9ada-afc46435fd14 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.158 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] VM Resumed (Lifecycle Event)
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.176 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.184 187082 INFO nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Took 5.81 seconds to spawn the instance on the hypervisor.
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.185 187082 DEBUG nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.186 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.216 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.249 187082 INFO nova.compute.manager [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Took 6.32 seconds to build instance.
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.263 187082 DEBUG oslo_concurrency.lockutils [None req-fef912d4-bebf-4311-90fd-f8d3627179d2 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:47 compute-1 podman[213849]: 2025-11-24 13:30:47.256602882 +0000 UTC m=+0.022050019 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:30:47 compute-1 podman[213849]: 2025-11-24 13:30:47.413148307 +0000 UTC m=+0.178595414 container create de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 13:30:47 compute-1 systemd[1]: Started libpod-conmon-de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7.scope.
Nov 24 13:30:47 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.516 187082 DEBUG nova.network.neutron [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Updated VIF entry in instance network info cache for port 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:30:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec7b3a562e07a9273eb5ab5e0b29d91ef936b418a191aef416600dee8ead2d4b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.520 187082 DEBUG nova.network.neutron [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Updating instance_info_cache with network_info: [{"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:30:47 compute-1 nova_compute[187078]: 2025-11-24 13:30:47.530 187082 DEBUG oslo_concurrency.lockutils [req-dfe115d0-5557-40dd-92d5-9d60bff76975 req-36853924-5a8b-446b-839d-10c91590780e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:30:47 compute-1 podman[213849]: 2025-11-24 13:30:47.577217809 +0000 UTC m=+0.342664946 container init de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 24 13:30:47 compute-1 podman[213849]: 2025-11-24 13:30:47.582326033 +0000 UTC m=+0.347773130 container start de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:30:47 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [NOTICE]   (213869) : New worker (213871) forked
Nov 24 13:30:47 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [NOTICE]   (213869) : Loading success.
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.233 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:48 compute-1 podman[213880]: 2025-11-24 13:30:48.543124586 +0000 UTC m=+0.093460521 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 24 13:30:48 compute-1 podman[213881]: 2025-11-24 13:30:48.563079699 +0000 UTC m=+0.100058184 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.856 187082 DEBUG nova.compute.manager [req-9e89b2d2-f252-468d-8b9d-2a54f0653332 req-abd75cf8-6f9e-4c80-97dc-f74143b2334e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.857 187082 DEBUG oslo_concurrency.lockutils [req-9e89b2d2-f252-468d-8b9d-2a54f0653332 req-abd75cf8-6f9e-4c80-97dc-f74143b2334e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.857 187082 DEBUG oslo_concurrency.lockutils [req-9e89b2d2-f252-468d-8b9d-2a54f0653332 req-abd75cf8-6f9e-4c80-97dc-f74143b2334e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.857 187082 DEBUG oslo_concurrency.lockutils [req-9e89b2d2-f252-468d-8b9d-2a54f0653332 req-abd75cf8-6f9e-4c80-97dc-f74143b2334e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.857 187082 DEBUG nova.compute.manager [req-9e89b2d2-f252-468d-8b9d-2a54f0653332 req-abd75cf8-6f9e-4c80-97dc-f74143b2334e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] No waiting events found dispatching network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:30:48 compute-1 nova_compute[187078]: 2025-11-24 13:30:48.858 187082 WARNING nova.compute.manager [req-9e89b2d2-f252-468d-8b9d-2a54f0653332 req-abd75cf8-6f9e-4c80-97dc-f74143b2334e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received unexpected event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 for instance with vm_state active and task_state None.
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: ERROR   13:30:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: ERROR   13:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: ERROR   13:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: ERROR   13:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: ERROR   13:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:30:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:30:50 compute-1 nova_compute[187078]: 2025-11-24 13:30:50.437 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:53 compute-1 nova_compute[187078]: 2025-11-24 13:30:53.234 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:55 compute-1 nova_compute[187078]: 2025-11-24 13:30:55.439 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:30:56 compute-1 podman[213921]: 2025-11-24 13:30:56.53875372 +0000 UTC m=+0.079333751 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public)
Nov 24 13:30:58 compute-1 nova_compute[187078]: 2025-11-24 13:30:58.236 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:00 compute-1 nova_compute[187078]: 2025-11-24 13:31:00.442 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:00 compute-1 ovn_controller[95368]: 2025-11-24T13:31:00Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:69:d7 10.100.0.7
Nov 24 13:31:00 compute-1 ovn_controller[95368]: 2025-11-24T13:31:00Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:69:d7 10.100.0.7
Nov 24 13:31:01 compute-1 sshd-session[213954]: Invalid user intell from 5.198.176.28 port 44654
Nov 24 13:31:01 compute-1 sshd-session[213954]: Received disconnect from 5.198.176.28 port 44654:11: Bye Bye [preauth]
Nov 24 13:31:01 compute-1 sshd-session[213954]: Disconnected from invalid user intell 5.198.176.28 port 44654 [preauth]
Nov 24 13:31:03 compute-1 nova_compute[187078]: 2025-11-24 13:31:03.237 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:03 compute-1 sshd-session[213956]: Invalid user svn from 45.78.217.131 port 34870
Nov 24 13:31:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:04.158 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:31:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:04.159 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:31:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:04.159 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:31:04 compute-1 sshd-session[213956]: Received disconnect from 45.78.217.131 port 34870:11: Bye Bye [preauth]
Nov 24 13:31:04 compute-1 sshd-session[213956]: Disconnected from invalid user svn 45.78.217.131 port 34870 [preauth]
Nov 24 13:31:05 compute-1 nova_compute[187078]: 2025-11-24 13:31:05.446 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:05 compute-1 podman[197429]: time="2025-11-24T13:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:31:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:31:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 24 13:31:08 compute-1 nova_compute[187078]: 2025-11-24 13:31:08.238 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:10 compute-1 nova_compute[187078]: 2025-11-24 13:31:10.448 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:10 compute-1 nova_compute[187078]: 2025-11-24 13:31:10.683 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:11 compute-1 podman[213959]: 2025-11-24 13:31:11.497633817 +0000 UTC m=+0.045334159 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 13:31:11 compute-1 podman[213958]: 2025-11-24 13:31:11.500689678 +0000 UTC m=+0.050285390 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:31:11 compute-1 nova_compute[187078]: 2025-11-24 13:31:11.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:13 compute-1 sshd-session[214001]: Invalid user cgpexpert from 175.100.24.139 port 35016
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.240 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:13 compute-1 sshd-session[214001]: Received disconnect from 175.100.24.139 port 35016:11: Bye Bye [preauth]
Nov 24 13:31:13 compute-1 sshd-session[214001]: Disconnected from invalid user cgpexpert 175.100.24.139 port 35016 [preauth]
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.694 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.694 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.757 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.821 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.822 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:13 compute-1 nova_compute[187078]: 2025-11-24 13:31:13.892 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.019 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.021 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5700MB free_disk=73.43087005615234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.021 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.021 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.148 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 03dff342-d941-4d7e-9ada-afc46435fd14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.148 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.148 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.178 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.188 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.214 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:31:14 compute-1 nova_compute[187078]: 2025-11-24 13:31:14.214 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:31:15 compute-1 nova_compute[187078]: 2025-11-24 13:31:15.450 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:16 compute-1 sshd-session[214010]: Invalid user sol from 45.148.10.240 port 57330
Nov 24 13:31:16 compute-1 nova_compute[187078]: 2025-11-24 13:31:16.214 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:16 compute-1 nova_compute[187078]: 2025-11-24 13:31:16.215 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:31:16 compute-1 nova_compute[187078]: 2025-11-24 13:31:16.215 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:31:16 compute-1 sshd-session[214010]: Connection closed by invalid user sol 45.148.10.240 port 57330 [preauth]
Nov 24 13:31:17 compute-1 nova_compute[187078]: 2025-11-24 13:31:17.004 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:31:17 compute-1 nova_compute[187078]: 2025-11-24 13:31:17.004 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:31:17 compute-1 nova_compute[187078]: 2025-11-24 13:31:17.004 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:31:17 compute-1 nova_compute[187078]: 2025-11-24 13:31:17.004 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 03dff342-d941-4d7e-9ada-afc46435fd14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:31:18 compute-1 nova_compute[187078]: 2025-11-24 13:31:18.241 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: ERROR   13:31:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: ERROR   13:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: ERROR   13:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: ERROR   13:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: ERROR   13:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:31:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:31:19 compute-1 podman[214012]: 2025-11-24 13:31:19.548803447 +0000 UTC m=+0.071483315 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:31:19 compute-1 podman[214013]: 2025-11-24 13:31:19.55809069 +0000 UTC m=+0.087553416 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.915 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Updating instance_info_cache with network_info: [{"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.932 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-03dff342-d941-4d7e-9ada-afc46435fd14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.933 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.934 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.934 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.934 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:19 compute-1 nova_compute[187078]: 2025-11-24 13:31:19.935 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:31:20 compute-1 nova_compute[187078]: 2025-11-24 13:31:20.497 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:22 compute-1 nova_compute[187078]: 2025-11-24 13:31:22.380 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:31:23 compute-1 nova_compute[187078]: 2025-11-24 13:31:23.243 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:23 compute-1 ovn_controller[95368]: 2025-11-24T13:31:23Z|00138|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 24 13:31:25 compute-1 nova_compute[187078]: 2025-11-24 13:31:25.548 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:27 compute-1 sshd-session[214059]: Invalid user postgres from 176.114.89.34 port 60428
Nov 24 13:31:27 compute-1 podman[214061]: 2025-11-24 13:31:27.549854263 +0000 UTC m=+0.092863896 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 13:31:27 compute-1 sshd-session[214059]: Received disconnect from 176.114.89.34 port 60428:11: Bye Bye [preauth]
Nov 24 13:31:27 compute-1 sshd-session[214059]: Disconnected from invalid user postgres 176.114.89.34 port 60428 [preauth]
Nov 24 13:31:28 compute-1 nova_compute[187078]: 2025-11-24 13:31:28.245 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:29 compute-1 sshd-session[214084]: Invalid user ubuntu from 193.32.162.146 port 55696
Nov 24 13:31:29 compute-1 sshd-session[214084]: Connection closed by invalid user ubuntu 193.32.162.146 port 55696 [preauth]
Nov 24 13:31:30 compute-1 nova_compute[187078]: 2025-11-24 13:31:30.551 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:33 compute-1 nova_compute[187078]: 2025-11-24 13:31:33.286 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:35 compute-1 nova_compute[187078]: 2025-11-24 13:31:35.554 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:35 compute-1 podman[197429]: time="2025-11-24T13:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:31:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:31:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 24 13:31:38 compute-1 nova_compute[187078]: 2025-11-24 13:31:38.290 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:38 compute-1 sshd-session[214087]: Received disconnect from 85.209.134.43 port 34972:11: Bye Bye [preauth]
Nov 24 13:31:38 compute-1 sshd-session[214087]: Disconnected from authenticating user root 85.209.134.43 port 34972 [preauth]
Nov 24 13:31:40 compute-1 nova_compute[187078]: 2025-11-24 13:31:40.557 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:42 compute-1 podman[214090]: 2025-11-24 13:31:42.535505084 +0000 UTC m=+0.067855131 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:31:42 compute-1 podman[214089]: 2025-11-24 13:31:42.557885721 +0000 UTC m=+0.098195666 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:31:42 compute-1 nova_compute[187078]: 2025-11-24 13:31:42.647 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Creating tmpfile /var/lib/nova/instances/tmphnxqeq1k to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 24 13:31:42 compute-1 nova_compute[187078]: 2025-11-24 13:31:42.648 187082 DEBUG nova.compute.manager [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphnxqeq1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 24 13:31:43 compute-1 nova_compute[187078]: 2025-11-24 13:31:43.289 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:43 compute-1 nova_compute[187078]: 2025-11-24 13:31:43.945 187082 DEBUG nova.compute.manager [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphnxqeq1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af2576d1-0abd-404f-a855-04e193b197e3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 24 13:31:43 compute-1 nova_compute[187078]: 2025-11-24 13:31:43.976 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-af2576d1-0abd-404f-a855-04e193b197e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:31:43 compute-1 nova_compute[187078]: 2025-11-24 13:31:43.977 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-af2576d1-0abd-404f-a855-04e193b197e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:31:43 compute-1 nova_compute[187078]: 2025-11-24 13:31:43.977 187082 DEBUG nova.network.neutron [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:31:44 compute-1 sshd-session[214133]: Invalid user dev from 68.183.82.237 port 36648
Nov 24 13:31:44 compute-1 sshd-session[214133]: Received disconnect from 68.183.82.237 port 36648:11: Bye Bye [preauth]
Nov 24 13:31:44 compute-1 sshd-session[214133]: Disconnected from invalid user dev 68.183.82.237 port 36648 [preauth]
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.528 187082 DEBUG nova.network.neutron [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Updating instance_info_cache with network_info: [{"id": "909858e9-ece1-4e65-970f-cee27b0b4525", "address": "fa:16:3e:e9:8d:61", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909858e9-ec", "ovs_interfaceid": "909858e9-ece1-4e65-970f-cee27b0b4525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.544 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-af2576d1-0abd-404f-a855-04e193b197e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.547 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphnxqeq1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af2576d1-0abd-404f-a855-04e193b197e3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.548 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Creating instance directory: /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.549 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Creating disk.info with the contents: {'/var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk': 'qcow2', '/var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.550 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.550 187082 DEBUG nova.objects.instance [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'trusted_certs' on Instance uuid af2576d1-0abd-404f-a855-04e193b197e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.586 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.613 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.679 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.680 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.681 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.692 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.749 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.750 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.783 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.784 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.785 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.845 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.846 187082 DEBUG nova.virt.disk.api [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Checking if we can resize image /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.847 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.905 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.906 187082 DEBUG nova.virt.disk.api [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Cannot resize image /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.906 187082 DEBUG nova.objects.instance [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid af2576d1-0abd-404f-a855-04e193b197e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.922 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.952 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk.config 485376" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.954 187082 DEBUG nova.virt.libvirt.volume.remotefs [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk.config to /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 24 13:31:45 compute-1 nova_compute[187078]: 2025-11-24 13:31:45.954 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk.config /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.527 187082 DEBUG oslo_concurrency.processutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3/disk.config /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3" returned: 0 in 0.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.529 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.531 187082 DEBUG nova.virt.libvirt.vif [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1416113278',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1416113278',id=16,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:30:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-t6qnvf7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:30:58Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=af2576d1-0abd-404f-a855-04e193b197e3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "909858e9-ece1-4e65-970f-cee27b0b4525", "address": "fa:16:3e:e9:8d:61", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap909858e9-ec", "ovs_interfaceid": "909858e9-ece1-4e65-970f-cee27b0b4525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.532 187082 DEBUG nova.network.os_vif_util [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "909858e9-ece1-4e65-970f-cee27b0b4525", "address": "fa:16:3e:e9:8d:61", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap909858e9-ec", "ovs_interfaceid": "909858e9-ece1-4e65-970f-cee27b0b4525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.534 187082 DEBUG nova.network.os_vif_util [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:8d:61,bridge_name='br-int',has_traffic_filtering=True,id=909858e9-ece1-4e65-970f-cee27b0b4525,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909858e9-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.534 187082 DEBUG os_vif [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:8d:61,bridge_name='br-int',has_traffic_filtering=True,id=909858e9-ece1-4e65-970f-cee27b0b4525,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909858e9-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.536 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.537 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.537 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.541 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.542 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap909858e9-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.542 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap909858e9-ec, col_values=(('external_ids', {'iface-id': '909858e9-ece1-4e65-970f-cee27b0b4525', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:8d:61', 'vm-uuid': 'af2576d1-0abd-404f-a855-04e193b197e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.545 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:46 compute-1 NetworkManager[55527]: <info>  [1763991106.5467] manager: (tap909858e9-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.548 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.557 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.559 187082 INFO os_vif [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:8d:61,bridge_name='br-int',has_traffic_filtering=True,id=909858e9-ece1-4e65-970f-cee27b0b4525,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909858e9-ec')
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.560 187082 DEBUG nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 24 13:31:46 compute-1 nova_compute[187078]: 2025-11-24 13:31:46.561 187082 DEBUG nova.compute.manager [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphnxqeq1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af2576d1-0abd-404f-a855-04e193b197e3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 24 13:31:48 compute-1 nova_compute[187078]: 2025-11-24 13:31:48.291 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:48.342 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:31:48 compute-1 nova_compute[187078]: 2025-11-24 13:31:48.344 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:48 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:48.346 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:31:49 compute-1 openstack_network_exporter[199599]: ERROR   13:31:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:31:49 compute-1 openstack_network_exporter[199599]: ERROR   13:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:31:49 compute-1 openstack_network_exporter[199599]: ERROR   13:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:31:49 compute-1 openstack_network_exporter[199599]: ERROR   13:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:31:49 compute-1 openstack_network_exporter[199599]: ERROR   13:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:31:49 compute-1 nova_compute[187078]: 2025-11-24 13:31:49.623 187082 DEBUG nova.network.neutron [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Port 909858e9-ece1-4e65-970f-cee27b0b4525 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 24 13:31:49 compute-1 nova_compute[187078]: 2025-11-24 13:31:49.626 187082 DEBUG nova.compute.manager [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphnxqeq1k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af2576d1-0abd-404f-a855-04e193b197e3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 24 13:31:49 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 24 13:31:49 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 24 13:31:49 compute-1 podman[214157]: 2025-11-24 13:31:49.950561716 +0000 UTC m=+0.074493315 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:31:50 compute-1 podman[214158]: 2025-11-24 13:31:50.07013294 +0000 UTC m=+0.178453029 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 24 13:31:50 compute-1 NetworkManager[55527]: <info>  [1763991110.0917] manager: (tap909858e9-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Nov 24 13:31:50 compute-1 kernel: tap909858e9-ec: entered promiscuous mode
Nov 24 13:31:50 compute-1 ovn_controller[95368]: 2025-11-24T13:31:50Z|00139|binding|INFO|Claiming lport 909858e9-ece1-4e65-970f-cee27b0b4525 for this additional chassis.
Nov 24 13:31:50 compute-1 ovn_controller[95368]: 2025-11-24T13:31:50Z|00140|binding|INFO|909858e9-ece1-4e65-970f-cee27b0b4525: Claiming fa:16:3e:e9:8d:61 10.100.0.6
Nov 24 13:31:50 compute-1 nova_compute[187078]: 2025-11-24 13:31:50.096 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:50 compute-1 ovn_controller[95368]: 2025-11-24T13:31:50Z|00141|binding|INFO|Setting lport 909858e9-ece1-4e65-970f-cee27b0b4525 ovn-installed in OVS
Nov 24 13:31:50 compute-1 nova_compute[187078]: 2025-11-24 13:31:50.116 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:50 compute-1 nova_compute[187078]: 2025-11-24 13:31:50.120 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:50 compute-1 systemd-udevd[214235]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:31:50 compute-1 systemd-machined[153355]: New machine qemu-12-instance-00000010.
Nov 24 13:31:50 compute-1 NetworkManager[55527]: <info>  [1763991110.1649] device (tap909858e9-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:31:50 compute-1 NetworkManager[55527]: <info>  [1763991110.1661] device (tap909858e9-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:31:50 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000010.
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.245 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991111.2445545, af2576d1-0abd-404f-a855-04e193b197e3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.246 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] VM Started (Lifecycle Event)
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.265 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.546 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.961 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991111.9606562, af2576d1-0abd-404f-a855-04e193b197e3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.961 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] VM Resumed (Lifecycle Event)
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.988 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:31:51 compute-1 nova_compute[187078]: 2025-11-24 13:31:51.993 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:31:52 compute-1 nova_compute[187078]: 2025-11-24 13:31:52.010 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Nov 24 13:31:52 compute-1 ovn_controller[95368]: 2025-11-24T13:31:52Z|00142|binding|INFO|Claiming lport 909858e9-ece1-4e65-970f-cee27b0b4525 for this chassis.
Nov 24 13:31:52 compute-1 ovn_controller[95368]: 2025-11-24T13:31:52Z|00143|binding|INFO|909858e9-ece1-4e65-970f-cee27b0b4525: Claiming fa:16:3e:e9:8d:61 10.100.0.6
Nov 24 13:31:52 compute-1 ovn_controller[95368]: 2025-11-24T13:31:52Z|00144|binding|INFO|Setting lport 909858e9-ece1-4e65-970f-cee27b0b4525 up in Southbound
Nov 24 13:31:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:52.899 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:8d:61 10.100.0.6'], port_security=['fa:16:3e:e9:8d:61 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'af2576d1-0abd-404f-a855-04e193b197e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '11', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=909858e9-ece1-4e65-970f-cee27b0b4525) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:31:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:52.901 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 909858e9-ece1-4e65-970f-cee27b0b4525 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:31:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:52.903 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:31:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:52.929 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[120481b0-33f0-46f8-8063-e2e9d3429320]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:31:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:52.981 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[26d2d5d4-e7f3-4417-bce8-16bd1be3b99f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:31:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:52.986 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd8daea-b2ca-4ae1-a9ac-6890a7ce6519]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:31:52 compute-1 sshd-session[214264]: banner exchange: Connection from 65.49.1.192 port 55272: invalid format
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.030 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[a90fe8ec-e439-49b4-87ac-0554c45bf988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.056 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[12d5a2c5-0bd9-46ee-8e70-17e1b921a3ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406537, 'reachable_time': 41668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214267, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.082 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[96f5861b-cfb9-422d-835c-2cc82342ddd4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406547, 'tstamp': 406547}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214268, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406550, 'tstamp': 406550}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214268, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.084 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:53 compute-1 nova_compute[187078]: 2025-11-24 13:31:53.086 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:53 compute-1 nova_compute[187078]: 2025-11-24 13:31:53.087 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.089 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.090 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.091 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.093 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:31:53 compute-1 nova_compute[187078]: 2025-11-24 13:31:53.294 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:31:53.348 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:31:54 compute-1 nova_compute[187078]: 2025-11-24 13:31:54.449 187082 INFO nova.compute.manager [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Post operation of migration started
Nov 24 13:31:55 compute-1 nova_compute[187078]: 2025-11-24 13:31:55.127 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-af2576d1-0abd-404f-a855-04e193b197e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:31:55 compute-1 nova_compute[187078]: 2025-11-24 13:31:55.128 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-af2576d1-0abd-404f-a855-04e193b197e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:31:55 compute-1 nova_compute[187078]: 2025-11-24 13:31:55.128 187082 DEBUG nova.network.neutron [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.551 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.650 187082 DEBUG nova.network.neutron [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Updating instance_info_cache with network_info: [{"id": "909858e9-ece1-4e65-970f-cee27b0b4525", "address": "fa:16:3e:e9:8d:61", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909858e9-ec", "ovs_interfaceid": "909858e9-ece1-4e65-970f-cee27b0b4525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.672 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-af2576d1-0abd-404f-a855-04e193b197e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.691 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.692 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.692 187082 DEBUG oslo_concurrency.lockutils [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:31:56 compute-1 nova_compute[187078]: 2025-11-24 13:31:56.702 187082 INFO nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 24 13:31:56 compute-1 virtqemud[186628]: Domain id=12 name='instance-00000010' uuid=af2576d1-0abd-404f-a855-04e193b197e3 is tainted: custom-monitor
Nov 24 13:31:57 compute-1 nova_compute[187078]: 2025-11-24 13:31:57.712 187082 INFO nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 24 13:31:58 compute-1 nova_compute[187078]: 2025-11-24 13:31:58.296 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:31:58 compute-1 podman[214272]: 2025-11-24 13:31:58.559038649 +0000 UTC m=+0.091699445 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Nov 24 13:31:58 compute-1 nova_compute[187078]: 2025-11-24 13:31:58.719 187082 INFO nova.virt.libvirt.driver [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 24 13:31:58 compute-1 nova_compute[187078]: 2025-11-24 13:31:58.727 187082 DEBUG nova.compute.manager [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:31:58 compute-1 nova_compute[187078]: 2025-11-24 13:31:58.746 187082 DEBUG nova.objects.instance [None req-912ed3ba-29cd-4dff-8ff3-3281c19c5e0f 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 24 13:31:59 compute-1 sshd-session[214269]: Received disconnect from 45.78.194.40 port 47116:11: Bye Bye [preauth]
Nov 24 13:31:59 compute-1 sshd-session[214269]: Disconnected from authenticating user root 45.78.194.40 port 47116 [preauth]
Nov 24 13:32:01 compute-1 nova_compute[187078]: 2025-11-24 13:32:01.555 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:03 compute-1 nova_compute[187078]: 2025-11-24 13:32:03.298 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.159 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.160 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.160 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.667 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "af2576d1-0abd-404f-a855-04e193b197e3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.668 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.668 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "af2576d1-0abd-404f-a855-04e193b197e3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.669 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.669 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.672 187082 INFO nova.compute.manager [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Terminating instance
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.673 187082 DEBUG nova.compute.manager [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:32:04 compute-1 kernel: tap909858e9-ec (unregistering): left promiscuous mode
Nov 24 13:32:04 compute-1 NetworkManager[55527]: <info>  [1763991124.7047] device (tap909858e9-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.721 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 ovn_controller[95368]: 2025-11-24T13:32:04Z|00145|binding|INFO|Releasing lport 909858e9-ece1-4e65-970f-cee27b0b4525 from this chassis (sb_readonly=0)
Nov 24 13:32:04 compute-1 ovn_controller[95368]: 2025-11-24T13:32:04Z|00146|binding|INFO|Setting lport 909858e9-ece1-4e65-970f-cee27b0b4525 down in Southbound
Nov 24 13:32:04 compute-1 ovn_controller[95368]: 2025-11-24T13:32:04Z|00147|binding|INFO|Removing iface tap909858e9-ec ovn-installed in OVS
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.726 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.732 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:8d:61 10.100.0.6'], port_security=['fa:16:3e:e9:8d:61 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'af2576d1-0abd-404f-a855-04e193b197e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '13', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=909858e9-ece1-4e65-970f-cee27b0b4525) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.734 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 909858e9-ece1-4e65-970f-cee27b0b4525 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.735 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.745 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.760 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f908cfe6-b373-4652-bb8b-c5cc0feb139e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:04 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 24 13:32:04 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Consumed 2.284s CPU time.
Nov 24 13:32:04 compute-1 systemd-machined[153355]: Machine qemu-12-instance-00000010 terminated.
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.804 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9afb49-d188-499c-8cc8-c6ebf0e7bad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.809 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5f899a5d-b8e2-4d89-a3e2-71a316d3d1d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.854 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4b455a-a3f0-4a9c-90f4-1b8a3155719a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.883 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[cec1f6b8-4abc-4f5d-8912-c888f9859a5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406537, 'reachable_time': 41668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214319, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.904 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.909 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.917 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[61b63902-dea2-40ef-8b85-9bba22177cb9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406547, 'tstamp': 406547}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214321, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406550, 'tstamp': 406550}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214321, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.919 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.921 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.926 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.929 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.929 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.930 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:04.930 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.954 187082 INFO nova.virt.libvirt.driver [-] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Instance destroyed successfully.
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.954 187082 DEBUG nova.objects.instance [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'resources' on Instance uuid af2576d1-0abd-404f-a855-04e193b197e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.966 187082 DEBUG nova.virt.libvirt.vif [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-24T13:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1416113278',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1416113278',id=16,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:30:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-t6qnvf7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:31:58Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=af2576d1-0abd-404f-a855-04e193b197e3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "909858e9-ece1-4e65-970f-cee27b0b4525", "address": "fa:16:3e:e9:8d:61", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909858e9-ec", "ovs_interfaceid": "909858e9-ece1-4e65-970f-cee27b0b4525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.967 187082 DEBUG nova.network.os_vif_util [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "909858e9-ece1-4e65-970f-cee27b0b4525", "address": "fa:16:3e:e9:8d:61", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909858e9-ec", "ovs_interfaceid": "909858e9-ece1-4e65-970f-cee27b0b4525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.967 187082 DEBUG nova.network.os_vif_util [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:8d:61,bridge_name='br-int',has_traffic_filtering=True,id=909858e9-ece1-4e65-970f-cee27b0b4525,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909858e9-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.968 187082 DEBUG os_vif [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:8d:61,bridge_name='br-int',has_traffic_filtering=True,id=909858e9-ece1-4e65-970f-cee27b0b4525,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909858e9-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.969 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.970 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap909858e9-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.972 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.974 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.976 187082 INFO os_vif [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:8d:61,bridge_name='br-int',has_traffic_filtering=True,id=909858e9-ece1-4e65-970f-cee27b0b4525,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909858e9-ec')
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.977 187082 INFO nova.virt.libvirt.driver [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Deleting instance files /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3_del
Nov 24 13:32:04 compute-1 nova_compute[187078]: 2025-11-24 13:32:04.977 187082 INFO nova.virt.libvirt.driver [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Deletion of /var/lib/nova/instances/af2576d1-0abd-404f-a855-04e193b197e3_del complete
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.028 187082 INFO nova.compute.manager [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.029 187082 DEBUG oslo.service.loopingcall [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.030 187082 DEBUG nova.compute.manager [-] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.030 187082 DEBUG nova.network.neutron [-] [instance: af2576d1-0abd-404f-a855-04e193b197e3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.266 187082 DEBUG nova.compute.manager [req-71b42039-c53a-4aa5-beff-37388b59ed82 req-5e13504f-9463-4a99-8276-18b0c743e304 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Received event network-vif-unplugged-909858e9-ece1-4e65-970f-cee27b0b4525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.267 187082 DEBUG oslo_concurrency.lockutils [req-71b42039-c53a-4aa5-beff-37388b59ed82 req-5e13504f-9463-4a99-8276-18b0c743e304 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "af2576d1-0abd-404f-a855-04e193b197e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.267 187082 DEBUG oslo_concurrency.lockutils [req-71b42039-c53a-4aa5-beff-37388b59ed82 req-5e13504f-9463-4a99-8276-18b0c743e304 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.267 187082 DEBUG oslo_concurrency.lockutils [req-71b42039-c53a-4aa5-beff-37388b59ed82 req-5e13504f-9463-4a99-8276-18b0c743e304 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.268 187082 DEBUG nova.compute.manager [req-71b42039-c53a-4aa5-beff-37388b59ed82 req-5e13504f-9463-4a99-8276-18b0c743e304 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] No waiting events found dispatching network-vif-unplugged-909858e9-ece1-4e65-970f-cee27b0b4525 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.268 187082 DEBUG nova.compute.manager [req-71b42039-c53a-4aa5-beff-37388b59ed82 req-5e13504f-9463-4a99-8276-18b0c743e304 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Received event network-vif-unplugged-909858e9-ece1-4e65-970f-cee27b0b4525 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.613 187082 DEBUG nova.network.neutron [-] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.628 187082 INFO nova.compute.manager [-] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Took 0.60 seconds to deallocate network for instance.
Nov 24 13:32:05 compute-1 podman[197429]: time="2025-11-24T13:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:32:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:32:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.663 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.664 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.671 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.699 187082 INFO nova.scheduler.client.report [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Deleted allocations for instance af2576d1-0abd-404f-a855-04e193b197e3
Nov 24 13:32:05 compute-1 nova_compute[187078]: 2025-11-24 13:32:05.747 187082 DEBUG oslo_concurrency.lockutils [None req-bf2a54b1-fbb9-4026-be91-6ff1ee26f5bb 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.638 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.638 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.639 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.639 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.639 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.640 187082 INFO nova.compute.manager [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Terminating instance
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.641 187082 DEBUG nova.compute.manager [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:32:06 compute-1 kernel: tap37dfce22-04 (unregistering): left promiscuous mode
Nov 24 13:32:06 compute-1 NetworkManager[55527]: <info>  [1763991126.6697] device (tap37dfce22-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00148|binding|INFO|Releasing lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 from this chassis (sb_readonly=0)
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00149|binding|INFO|Setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 down in Southbound
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00150|binding|INFO|Removing iface tap37dfce22-04 ovn-installed in OVS
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.679 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.686 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:69:d7 10.100.0.7'], port_security=['fa:16:3e:b8:69:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '03dff342-d941-4d7e-9ada-afc46435fd14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.687 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.688 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.689 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[6caf5a72-2f73-4330-81f5-c0fe038a27e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.689 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.695 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Nov 24 13:32:06 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 16.586s CPU time.
Nov 24 13:32:06 compute-1 systemd-machined[153355]: Machine qemu-11-instance-0000000f terminated.
Nov 24 13:32:06 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [NOTICE]   (213869) : haproxy version is 2.8.14-c23fe91
Nov 24 13:32:06 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [NOTICE]   (213869) : path to executable is /usr/sbin/haproxy
Nov 24 13:32:06 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [WARNING]  (213869) : Exiting Master process...
Nov 24 13:32:06 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [WARNING]  (213869) : Exiting Master process...
Nov 24 13:32:06 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [ALERT]    (213869) : Current worker (213871) exited with code 143 (Terminated)
Nov 24 13:32:06 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[213865]: [WARNING]  (213869) : All workers exited. Exiting... (0)
Nov 24 13:32:06 compute-1 systemd[1]: libpod-de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7.scope: Deactivated successfully.
Nov 24 13:32:06 compute-1 podman[214360]: 2025-11-24 13:32:06.836116693 +0000 UTC m=+0.048164454 container died de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:32:06 compute-1 kernel: tap37dfce22-04: entered promiscuous mode
Nov 24 13:32:06 compute-1 NetworkManager[55527]: <info>  [1763991126.8631] manager: (tap37dfce22-04): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 24 13:32:06 compute-1 systemd-udevd[214311]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.864 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00151|binding|INFO|Claiming lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 for this chassis.
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00152|binding|INFO|37dfce22-04c0-4d3b-b2ab-31c6e5ce6078: Claiming fa:16:3e:b8:69:d7 10.100.0.7
Nov 24 13:32:06 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7-userdata-shm.mount: Deactivated successfully.
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.872 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:69:d7 10.100.0.7'], port_security=['fa:16:3e:b8:69:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '03dff342-d941-4d7e-9ada-afc46435fd14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:32:06 compute-1 kernel: tap37dfce22-04 (unregistering): left promiscuous mode
Nov 24 13:32:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-ec7b3a562e07a9273eb5ab5e0b29d91ef936b418a191aef416600dee8ead2d4b-merged.mount: Deactivated successfully.
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: hostname: compute-1
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00153|binding|INFO|Setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 ovn-installed in OVS
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00154|binding|INFO|Setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 up in Southbound
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.885 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00155|binding|INFO|Releasing lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 from this chassis (sb_readonly=1)
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00156|binding|INFO|Removing iface tap37dfce22-04 ovn-installed in OVS
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00157|if_status|INFO|Dropped 3 log messages in last 706 seconds (most recently, 706 seconds ago) due to excessive rate
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00158|if_status|INFO|Not setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 down as sb is readonly
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.888 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00159|binding|INFO|Releasing lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 from this chassis (sb_readonly=0)
Nov 24 13:32:06 compute-1 ovn_controller[95368]: 2025-11-24T13:32:06Z|00160|binding|INFO|Setting lport 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 down in Southbound
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 podman[214360]: 2025-11-24 13:32:06.893092266 +0000 UTC m=+0.105140027 container cleanup de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.896 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:69:d7 10.100.0.7'], port_security=['fa:16:3e:b8:69:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '03dff342-d941-4d7e-9ada-afc46435fd14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.899 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:06 compute-1 systemd[1]: libpod-conmon-de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7.scope: Deactivated successfully.
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 virtnodedevd[186823]: ethtool ioctl error on tap37dfce22-04: No such device
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.928 187082 INFO nova.virt.libvirt.driver [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Instance destroyed successfully.
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.929 187082 DEBUG nova.objects.instance [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'resources' on Instance uuid 03dff342-d941-4d7e-9ada-afc46435fd14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.942 187082 DEBUG nova.virt.libvirt.vif [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:30:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-430944557',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-430944557',id=15,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:30:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-f51vxjev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:30:47Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=03dff342-d941-4d7e-9ada-afc46435fd14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.943 187082 DEBUG nova.network.os_vif_util [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "address": "fa:16:3e:b8:69:d7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37dfce22-04", "ovs_interfaceid": "37dfce22-04c0-4d3b-b2ab-31c6e5ce6078", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.944 187082 DEBUG nova.network.os_vif_util [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:32:06 compute-1 nova_compute[187078]: 2025-11-24 13:32:06.944 187082 DEBUG os_vif [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:32:06 compute-1 podman[214402]: 2025-11-24 13:32:06.958930993 +0000 UTC m=+0.043119551 container remove de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.964 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b58752-acfe-452f-be5c-63a3f52f1d51]: (4, ('Mon Nov 24 01:32:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7)\nde9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7\nMon Nov 24 01:32:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (de9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7)\nde9854842b74694225a82b20d33b070939a0c5529fd69580200728042fdc5ce7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.966 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[02041947-e3de-43b7-8958-a5264cb03079]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.967 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:06 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:32:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:06.988 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[388773dc-7a58-4dba-8545-b054cfcc2186]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.003 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4c950644-f4f5-4d74-955e-1db08bc011de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.005 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5baf54e6-5892-4519-ac54-4efa899c92fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.027 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6e2b4e-fffe-495e-b5aa-5331c1554d5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406530, 'reachable_time': 27361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214434, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.030 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.030 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[a4148b6b-92b9-42eb-8c4d-0011e3249cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.031 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.032 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.032 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4382c5a0-c3ef-42c5-b053-630c6c711333]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.033 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.034 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:32:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:07.035 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[91aa8063-96ae-497b-a184-8d29265ced76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.158 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.161 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37dfce22-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.163 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.164 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.167 187082 INFO os_vif [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:69:d7,bridge_name='br-int',has_traffic_filtering=True,id=37dfce22-04c0-4d3b-b2ab-31c6e5ce6078,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37dfce22-04')
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.167 187082 INFO nova.virt.libvirt.driver [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Deleting instance files /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14_del
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.168 187082 INFO nova.virt.libvirt.driver [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Deletion of /var/lib/nova/instances/03dff342-d941-4d7e-9ada-afc46435fd14_del complete
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.217 187082 INFO nova.compute.manager [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Took 0.58 seconds to destroy the instance on the hypervisor.
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.218 187082 DEBUG oslo.service.loopingcall [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.218 187082 DEBUG nova.compute.manager [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.218 187082 DEBUG nova.network.neutron [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.335 187082 DEBUG nova.compute.manager [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Received event network-vif-plugged-909858e9-ece1-4e65-970f-cee27b0b4525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.335 187082 DEBUG oslo_concurrency.lockutils [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "af2576d1-0abd-404f-a855-04e193b197e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.336 187082 DEBUG oslo_concurrency.lockutils [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.336 187082 DEBUG oslo_concurrency.lockutils [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "af2576d1-0abd-404f-a855-04e193b197e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.336 187082 DEBUG nova.compute.manager [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] No waiting events found dispatching network-vif-plugged-909858e9-ece1-4e65-970f-cee27b0b4525 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.336 187082 WARNING nova.compute.manager [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Received unexpected event network-vif-plugged-909858e9-ece1-4e65-970f-cee27b0b4525 for instance with vm_state deleted and task_state None.
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.336 187082 DEBUG nova.compute.manager [req-32e788e8-6dc3-46a1-bf26-06f5ed9f9430 req-4226ed15-3a61-406c-a851-c7d42df5d9b7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Received event network-vif-deleted-909858e9-ece1-4e65-970f-cee27b0b4525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.474 187082 DEBUG nova.compute.manager [req-f6770c49-c44c-4315-aaa0-b70fdf43667a req-ed781b7a-a40b-4e79-a536-94793ab26fcf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-unplugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.474 187082 DEBUG oslo_concurrency.lockutils [req-f6770c49-c44c-4315-aaa0-b70fdf43667a req-ed781b7a-a40b-4e79-a536-94793ab26fcf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.475 187082 DEBUG oslo_concurrency.lockutils [req-f6770c49-c44c-4315-aaa0-b70fdf43667a req-ed781b7a-a40b-4e79-a536-94793ab26fcf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.475 187082 DEBUG oslo_concurrency.lockutils [req-f6770c49-c44c-4315-aaa0-b70fdf43667a req-ed781b7a-a40b-4e79-a536-94793ab26fcf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.475 187082 DEBUG nova.compute.manager [req-f6770c49-c44c-4315-aaa0-b70fdf43667a req-ed781b7a-a40b-4e79-a536-94793ab26fcf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] No waiting events found dispatching network-vif-unplugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.475 187082 DEBUG nova.compute.manager [req-f6770c49-c44c-4315-aaa0-b70fdf43667a req-ed781b7a-a40b-4e79-a536-94793ab26fcf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-unplugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.713 187082 DEBUG nova.network.neutron [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.724 187082 INFO nova.compute.manager [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Took 0.51 seconds to deallocate network for instance.
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.757 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.758 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.798 187082 DEBUG nova.compute.provider_tree [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.810 187082 DEBUG nova.scheduler.client.report [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.826 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.848 187082 INFO nova.scheduler.client.report [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Deleted allocations for instance 03dff342-d941-4d7e-9ada-afc46435fd14
Nov 24 13:32:07 compute-1 nova_compute[187078]: 2025-11-24 13:32:07.921 187082 DEBUG oslo_concurrency.lockutils [None req-3cc9dd20-733d-4a93-91af-06766b81eec7 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:08 compute-1 nova_compute[187078]: 2025-11-24 13:32:08.300 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.411 187082 DEBUG nova.compute.manager [req-c0ea6442-ce08-45d0-a8d4-0aac1bd9fe8e req-522a7c8b-485f-4ae8-bee1-727f8924cbb0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-deleted-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.554 187082 DEBUG nova.compute.manager [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.554 187082 DEBUG oslo_concurrency.lockutils [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.555 187082 DEBUG oslo_concurrency.lockutils [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.555 187082 DEBUG oslo_concurrency.lockutils [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.555 187082 DEBUG nova.compute.manager [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] No waiting events found dispatching network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.555 187082 WARNING nova.compute.manager [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received unexpected event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 for instance with vm_state deleted and task_state None.
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.556 187082 DEBUG nova.compute.manager [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.557 187082 DEBUG oslo_concurrency.lockutils [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.558 187082 DEBUG oslo_concurrency.lockutils [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.558 187082 DEBUG oslo_concurrency.lockutils [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "03dff342-d941-4d7e-9ada-afc46435fd14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.558 187082 DEBUG nova.compute.manager [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] No waiting events found dispatching network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:32:09 compute-1 nova_compute[187078]: 2025-11-24 13:32:09.559 187082 WARNING nova.compute.manager [req-69606636-8d53-45e2-8069-2db66632b083 req-a63cd1dc-9b41-4ad6-a082-9c28c81656f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Received unexpected event network-vif-plugged-37dfce22-04c0-4d3b-b2ab-31c6e5ce6078 for instance with vm_state deleted and task_state None.
Nov 24 13:32:10 compute-1 nova_compute[187078]: 2025-11-24 13:32:10.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:11 compute-1 nova_compute[187078]: 2025-11-24 13:32:11.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:11 compute-1 sshd-session[214435]: Invalid user fred from 5.198.176.28 port 44764
Nov 24 13:32:12 compute-1 sshd-session[214435]: Received disconnect from 5.198.176.28 port 44764:11: Bye Bye [preauth]
Nov 24 13:32:12 compute-1 sshd-session[214435]: Disconnected from invalid user fred 5.198.176.28 port 44764 [preauth]
Nov 24 13:32:12 compute-1 nova_compute[187078]: 2025-11-24 13:32:12.164 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:13 compute-1 nova_compute[187078]: 2025-11-24 13:32:13.301 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:13 compute-1 podman[214437]: 2025-11-24 13:32:13.555275394 +0000 UTC m=+0.095578637 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:32:13 compute-1 podman[214438]: 2025-11-24 13:32:13.567888896 +0000 UTC m=+0.108196870 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:32:13 compute-1 nova_compute[187078]: 2025-11-24 13:32:13.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.687 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.688 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.879 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.880 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5841MB free_disk=73.46011734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.880 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.881 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.932 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.933 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.953 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.965 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.987 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:32:14 compute-1 nova_compute[187078]: 2025-11-24 13:32:14.987 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:17 compute-1 nova_compute[187078]: 2025-11-24 13:32:17.167 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:17 compute-1 nova_compute[187078]: 2025-11-24 13:32:17.988 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:17 compute-1 nova_compute[187078]: 2025-11-24 13:32:17.989 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:32:17 compute-1 nova_compute[187078]: 2025-11-24 13:32:17.989 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:32:18 compute-1 nova_compute[187078]: 2025-11-24 13:32:18.005 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:32:18 compute-1 nova_compute[187078]: 2025-11-24 13:32:18.005 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:18 compute-1 nova_compute[187078]: 2025-11-24 13:32:18.006 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:18 compute-1 nova_compute[187078]: 2025-11-24 13:32:18.303 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: ERROR   13:32:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: ERROR   13:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: ERROR   13:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: ERROR   13:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: ERROR   13:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:32:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:32:19 compute-1 nova_compute[187078]: 2025-11-24 13:32:19.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:19 compute-1 nova_compute[187078]: 2025-11-24 13:32:19.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:32:19 compute-1 nova_compute[187078]: 2025-11-24 13:32:19.952 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991124.9508085, af2576d1-0abd-404f-a855-04e193b197e3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:32:19 compute-1 nova_compute[187078]: 2025-11-24 13:32:19.953 187082 INFO nova.compute.manager [-] [instance: af2576d1-0abd-404f-a855-04e193b197e3] VM Stopped (Lifecycle Event)
Nov 24 13:32:19 compute-1 nova_compute[187078]: 2025-11-24 13:32:19.977 187082 DEBUG nova.compute.manager [None req-fd83c92e-ec31-4655-a91f-f4d26f48ae87 - - - - - -] [instance: af2576d1-0abd-404f-a855-04e193b197e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:32:20 compute-1 podman[214482]: 2025-11-24 13:32:20.531257909 +0000 UTC m=+0.073134997 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 24 13:32:20 compute-1 podman[214483]: 2025-11-24 13:32:20.600798788 +0000 UTC m=+0.133798335 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 13:32:20 compute-1 nova_compute[187078]: 2025-11-24 13:32:20.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:21 compute-1 nova_compute[187078]: 2025-11-24 13:32:21.927 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991126.924777, 03dff342-d941-4d7e-9ada-afc46435fd14 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:32:21 compute-1 nova_compute[187078]: 2025-11-24 13:32:21.928 187082 INFO nova.compute.manager [-] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] VM Stopped (Lifecycle Event)
Nov 24 13:32:21 compute-1 nova_compute[187078]: 2025-11-24 13:32:21.958 187082 DEBUG nova.compute.manager [None req-1b7d1fd7-a7c1-4b56-aeb7-92a40ad88049 - - - - - -] [instance: 03dff342-d941-4d7e-9ada-afc46435fd14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:32:22 compute-1 nova_compute[187078]: 2025-11-24 13:32:22.168 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:23 compute-1 nova_compute[187078]: 2025-11-24 13:32:23.306 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:23 compute-1 nova_compute[187078]: 2025-11-24 13:32:23.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:32:27 compute-1 nova_compute[187078]: 2025-11-24 13:32:27.171 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:28 compute-1 nova_compute[187078]: 2025-11-24 13:32:28.308 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:29 compute-1 podman[214529]: 2025-11-24 13:32:29.563650477 +0000 UTC m=+0.100243804 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Nov 24 13:32:32 compute-1 nova_compute[187078]: 2025-11-24 13:32:32.175 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:33 compute-1 nova_compute[187078]: 2025-11-24 13:32:33.311 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:35 compute-1 podman[197429]: time="2025-11-24T13:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:32:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:32:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Nov 24 13:32:36 compute-1 sshd-session[214551]: Invalid user zjw from 176.114.89.34 port 55378
Nov 24 13:32:37 compute-1 sshd-session[214551]: Received disconnect from 176.114.89.34 port 55378:11: Bye Bye [preauth]
Nov 24 13:32:37 compute-1 sshd-session[214551]: Disconnected from invalid user zjw 176.114.89.34 port 55378 [preauth]
Nov 24 13:32:37 compute-1 nova_compute[187078]: 2025-11-24 13:32:37.177 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:37 compute-1 ovn_controller[95368]: 2025-11-24T13:32:37Z|00161|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Nov 24 13:32:38 compute-1 nova_compute[187078]: 2025-11-24 13:32:38.315 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:41 compute-1 sshd-session[214553]: Invalid user roott from 85.209.134.43 port 39178
Nov 24 13:32:41 compute-1 sshd-session[214553]: Received disconnect from 85.209.134.43 port 39178:11: Bye Bye [preauth]
Nov 24 13:32:41 compute-1 sshd-session[214553]: Disconnected from invalid user roott 85.209.134.43 port 39178 [preauth]
Nov 24 13:32:42 compute-1 nova_compute[187078]: 2025-11-24 13:32:42.178 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:43 compute-1 nova_compute[187078]: 2025-11-24 13:32:43.317 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.088 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.089 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.102 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.169 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.170 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.177 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.177 187082 INFO nova.compute.claims [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.282 187082 DEBUG nova.compute.provider_tree [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.294 187082 DEBUG nova.scheduler.client.report [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.308 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.309 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.359 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.360 187082 DEBUG nova.network.neutron [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.378 187082 INFO nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.394 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:32:44 compute-1 podman[214555]: 2025-11-24 13:32:44.523749303 +0000 UTC m=+0.072543022 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:32:44 compute-1 podman[214556]: 2025-11-24 13:32:44.531517564 +0000 UTC m=+0.069933021 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.581 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.582 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.583 187082 INFO nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Creating image(s)
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.583 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.584 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.584 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.597 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.680 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.681 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.682 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.694 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.765 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.767 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.809 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.811 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.812 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.875 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.876 187082 DEBUG nova.virt.disk.api [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.877 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.942 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.943 187082 DEBUG nova.virt.disk.api [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.943 187082 DEBUG nova.objects.instance [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid df6b16e5-93cd-49e7-a360-268fee816249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.955 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.956 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Ensure instance console log exists: /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.956 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.956 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:44 compute-1 nova_compute[187078]: 2025-11-24 13:32:44.957 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:45 compute-1 nova_compute[187078]: 2025-11-24 13:32:45.035 187082 DEBUG nova.policy [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:32:45 compute-1 nova_compute[187078]: 2025-11-24 13:32:45.505 187082 DEBUG nova.network.neutron [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Successfully created port: df82121b-bc7d-43b5-8380-1339b831f456 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.141 187082 DEBUG nova.network.neutron [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Successfully updated port: df82121b-bc7d-43b5-8380-1339b831f456 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.167 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.167 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.168 187082 DEBUG nova.network.neutron [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.243 187082 DEBUG nova.compute.manager [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-changed-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.244 187082 DEBUG nova.compute.manager [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Refreshing instance network info cache due to event network-changed-df82121b-bc7d-43b5-8380-1339b831f456. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.244 187082 DEBUG oslo_concurrency.lockutils [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:32:46 compute-1 nova_compute[187078]: 2025-11-24 13:32:46.303 187082 DEBUG nova.network.neutron [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:32:47 compute-1 nova_compute[187078]: 2025-11-24 13:32:47.181 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.319 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.566 187082 DEBUG nova.network.neutron [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updating instance_info_cache with network_info: [{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.582 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.583 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Instance network_info: |[{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.583 187082 DEBUG oslo_concurrency.lockutils [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.584 187082 DEBUG nova.network.neutron [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Refreshing network info cache for port df82121b-bc7d-43b5-8380-1339b831f456 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.587 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Start _get_guest_xml network_info=[{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.592 187082 WARNING nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.600 187082 DEBUG nova.virt.libvirt.host [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.601 187082 DEBUG nova.virt.libvirt.host [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.604 187082 DEBUG nova.virt.libvirt.host [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.604 187082 DEBUG nova.virt.libvirt.host [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.605 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.606 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.606 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.606 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.607 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.607 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.607 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.607 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.607 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.608 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.608 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.608 187082 DEBUG nova.virt.hardware [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.611 187082 DEBUG nova.virt.libvirt.vif [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:32:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-890041496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-890041496',id=17,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-scejrrau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_sta
te='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:32:44Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=df6b16e5-93cd-49e7-a360-268fee816249,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.611 187082 DEBUG nova.network.os_vif_util [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.612 187082 DEBUG nova.network.os_vif_util [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.613 187082 DEBUG nova.objects.instance [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid df6b16e5-93cd-49e7-a360-268fee816249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.624 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <uuid>df6b16e5-93cd-49e7-a360-268fee816249</uuid>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <name>instance-00000011</name>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-890041496</nova:name>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:32:48</nova:creationTime>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         <nova:port uuid="df82121b-bc7d-43b5-8380-1339b831f456">
Nov 24 13:32:48 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <system>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <entry name="serial">df6b16e5-93cd-49e7-a360-268fee816249</entry>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <entry name="uuid">df6b16e5-93cd-49e7-a360-268fee816249</entry>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </system>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <os>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </os>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <features>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </features>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.config"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:c2:8b:e7"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <target dev="tapdf82121b-bc"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/console.log" append="off"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <video>
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </video>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:32:48 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:32:48 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:32:48 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:32:48 compute-1 nova_compute[187078]: </domain>
Nov 24 13:32:48 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.626 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Preparing to wait for external event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.627 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.627 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.628 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.629 187082 DEBUG nova.virt.libvirt.vif [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:32:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-890041496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-890041496',id=17,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-scejrrau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagLis
t,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:32:44Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=df6b16e5-93cd-49e7-a360-268fee816249,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.630 187082 DEBUG nova.network.os_vif_util [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.631 187082 DEBUG nova.network.os_vif_util [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.632 187082 DEBUG os_vif [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.633 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.634 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.634 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.640 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.640 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf82121b-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.641 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf82121b-bc, col_values=(('external_ids', {'iface-id': 'df82121b-bc7d-43b5-8380-1339b831f456', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:8b:e7', 'vm-uuid': 'df6b16e5-93cd-49e7-a360-268fee816249'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.674 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:48 compute-1 NetworkManager[55527]: <info>  [1763991168.6761] manager: (tapdf82121b-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.680 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.683 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.685 187082 INFO os_vif [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc')
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.732 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.733 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.733 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:c2:8b:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:32:48 compute-1 nova_compute[187078]: 2025-11-24 13:32:48.734 187082 INFO nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Using config drive
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: ERROR   13:32:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: ERROR   13:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: ERROR   13:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: ERROR   13:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: ERROR   13:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:32:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:32:49 compute-1 nova_compute[187078]: 2025-11-24 13:32:49.562 187082 INFO nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Creating config drive at /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.config
Nov 24 13:32:49 compute-1 nova_compute[187078]: 2025-11-24 13:32:49.573 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn0k6kpt2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:32:49 compute-1 nova_compute[187078]: 2025-11-24 13:32:49.707 187082 DEBUG oslo_concurrency.processutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn0k6kpt2" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:32:49 compute-1 kernel: tapdf82121b-bc: entered promiscuous mode
Nov 24 13:32:49 compute-1 NetworkManager[55527]: <info>  [1763991169.7758] manager: (tapdf82121b-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Nov 24 13:32:49 compute-1 ovn_controller[95368]: 2025-11-24T13:32:49Z|00162|binding|INFO|Claiming lport df82121b-bc7d-43b5-8380-1339b831f456 for this chassis.
Nov 24 13:32:49 compute-1 ovn_controller[95368]: 2025-11-24T13:32:49Z|00163|binding|INFO|df82121b-bc7d-43b5-8380-1339b831f456: Claiming fa:16:3e:c2:8b:e7 10.100.0.7
Nov 24 13:32:49 compute-1 nova_compute[187078]: 2025-11-24 13:32:49.778 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.784 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:8b:e7 10.100.0.7'], port_security=['fa:16:3e:c2:8b:e7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'df6b16e5-93cd-49e7-a360-268fee816249', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=df82121b-bc7d-43b5-8380-1339b831f456) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.785 104225 INFO neutron.agent.ovn.metadata.agent [-] Port df82121b-bc7d-43b5-8380-1339b831f456 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.787 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.797 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[637161e6-3bdb-4a4a-b013-bf89fc4202bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.798 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.803 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.803 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad6af5e-5089-4e8e-96fd-9ff413ffc98f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.804 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[75b0f618-8b24-4006-b6ea-3a19ff4b6e6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 ovn_controller[95368]: 2025-11-24T13:32:49Z|00164|binding|INFO|Setting lport df82121b-bc7d-43b5-8380-1339b831f456 up in Southbound
Nov 24 13:32:49 compute-1 ovn_controller[95368]: 2025-11-24T13:32:49Z|00165|binding|INFO|Setting lport df82121b-bc7d-43b5-8380-1339b831f456 ovn-installed in OVS
Nov 24 13:32:49 compute-1 nova_compute[187078]: 2025-11-24 13:32:49.807 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:49 compute-1 systemd-udevd[214631]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:32:49 compute-1 nova_compute[187078]: 2025-11-24 13:32:49.814 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.823 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1d36a4-7218-4f17-9dcd-b4b9ce6bb083]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 NetworkManager[55527]: <info>  [1763991169.8268] device (tapdf82121b-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:32:49 compute-1 NetworkManager[55527]: <info>  [1763991169.8276] device (tapdf82121b-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:32:49 compute-1 systemd-machined[153355]: New machine qemu-13-instance-00000011.
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.848 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[91c25537-ff26-4c0a-a9f3-13f7c65b5e26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.876 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[6136bc2c-5585-43cb-a4da-78ef4e118416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 NetworkManager[55527]: <info>  [1763991169.8826] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.881 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e428074d-4e5b-4d71-b21d-b7a347ee6d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.931 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[b1011bbd-f617-40bd-9bc0-606a704e7d20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.936 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[980dc53f-6ab2-41fe-89b8-e5ebb8738025]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:49 compute-1 NetworkManager[55527]: <info>  [1763991169.9706] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:32:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:49.980 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6cdcb2-3715-4ead-b775-d6bc9031e7a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.001 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c9bcda-bb0a-42c4-8cdf-ef8773c75c4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418863, 'reachable_time': 23639, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214666, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.009 187082 DEBUG nova.compute.manager [req-aeefc9d9-3586-4ec1-a020-43c672bc8044 req-f741f0bf-2c09-45da-9c47-6053d725d510 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.010 187082 DEBUG oslo_concurrency.lockutils [req-aeefc9d9-3586-4ec1-a020-43c672bc8044 req-f741f0bf-2c09-45da-9c47-6053d725d510 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.010 187082 DEBUG oslo_concurrency.lockutils [req-aeefc9d9-3586-4ec1-a020-43c672bc8044 req-f741f0bf-2c09-45da-9c47-6053d725d510 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.010 187082 DEBUG oslo_concurrency.lockutils [req-aeefc9d9-3586-4ec1-a020-43c672bc8044 req-f741f0bf-2c09-45da-9c47-6053d725d510 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.011 187082 DEBUG nova.compute.manager [req-aeefc9d9-3586-4ec1-a020-43c672bc8044 req-f741f0bf-2c09-45da-9c47-6053d725d510 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Processing event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.025 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7ce112-868c-472c-934d-d2fc061b10c3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418863, 'tstamp': 418863}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214667, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.047 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9a74e5-0a53-4387-8e46-beda78d40645]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418863, 'reachable_time': 23639, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214668, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.096 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[41463a58-2353-4283-8d29-f1f208a5c9c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.139 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.139 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.199 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5761cb99-c170-4df7-acce-bcaa565ba760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.201 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.202 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.202 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.204 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:50 compute-1 NetworkManager[55527]: <info>  [1763991170.2050] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 24 13:32:50 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.208 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.209 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.211 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:50 compute-1 ovn_controller[95368]: 2025-11-24T13:32:50Z|00166|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.227 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.227 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.232 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0c16f3a2-f0e1-48ad-956b-dc5156661489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.233 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.235 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.274 187082 DEBUG nova.network.neutron [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updated VIF entry in instance network info cache for port df82121b-bc7d-43b5-8380-1339b831f456. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.274 187082 DEBUG nova.network.neutron [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updating instance_info_cache with network_info: [{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.291 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991170.2912312, df6b16e5-93cd-49e7-a360-268fee816249 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.292 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] VM Started (Lifecycle Event)
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.296 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.299 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.304 187082 INFO nova.virt.libvirt.driver [-] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Instance spawned successfully.
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.305 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.335 187082 DEBUG oslo_concurrency.lockutils [req-1f1d295f-c8bb-4a71-b821-e4f3da2c3cec req-f275c989-4707-4bca-bcd9-0661ae8eed99 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.342 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.343 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.344 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.344 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.345 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.346 187082 DEBUG nova.virt.libvirt.driver [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.404 187082 INFO nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Took 5.82 seconds to spawn the instance on the hypervisor.
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.405 187082 DEBUG nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.443 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.450 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.467 187082 INFO nova.compute.manager [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Took 6.32 seconds to build instance.
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.470 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991170.2915123, df6b16e5-93cd-49e7-a360-268fee816249 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.470 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] VM Paused (Lifecycle Event)
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.494 187082 DEBUG oslo_concurrency.lockutils [None req-552fe934-c8a4-487c-8205-dea36a239d2b 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.497 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.500 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991170.2990758, df6b16e5-93cd-49e7-a360-268fee816249 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.501 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] VM Resumed (Lifecycle Event)
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.516 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:32:50 compute-1 nova_compute[187078]: 2025-11-24 13:32:50.519 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:32:50 compute-1 podman[214708]: 2025-11-24 13:32:50.659642469 +0000 UTC m=+0.063739673 container create 24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 24 13:32:50 compute-1 systemd[1]: Started libpod-conmon-24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad.scope.
Nov 24 13:32:50 compute-1 podman[214708]: 2025-11-24 13:32:50.626403746 +0000 UTC m=+0.030501050 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:32:50 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:32:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a51f16aaf78b7f8f1939b431baab81b71f71266017a763d31dba7d2dec9777fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:32:50 compute-1 podman[214708]: 2025-11-24 13:32:50.767155299 +0000 UTC m=+0.171252523 container init 24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:32:50 compute-1 podman[214708]: 2025-11-24 13:32:50.774663993 +0000 UTC m=+0.178761187 container start 24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 13:32:50 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [NOTICE]   (214761) : New worker (214769) forked
Nov 24 13:32:50 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [NOTICE]   (214761) : Loading success.
Nov 24 13:32:50 compute-1 podman[214721]: 2025-11-24 13:32:50.811856824 +0000 UTC m=+0.093917693 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:32:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:50.843 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:32:50 compute-1 podman[214724]: 2025-11-24 13:32:50.870130267 +0000 UTC m=+0.139234484 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:32:52 compute-1 nova_compute[187078]: 2025-11-24 13:32:52.087 187082 DEBUG nova.compute.manager [req-1cc8a694-7148-4550-8bef-777349d045eb req-a4e5d699-494c-41e2-af90-47917a612daf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:32:52 compute-1 nova_compute[187078]: 2025-11-24 13:32:52.088 187082 DEBUG oslo_concurrency.lockutils [req-1cc8a694-7148-4550-8bef-777349d045eb req-a4e5d699-494c-41e2-af90-47917a612daf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:32:52 compute-1 nova_compute[187078]: 2025-11-24 13:32:52.089 187082 DEBUG oslo_concurrency.lockutils [req-1cc8a694-7148-4550-8bef-777349d045eb req-a4e5d699-494c-41e2-af90-47917a612daf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:32:52 compute-1 nova_compute[187078]: 2025-11-24 13:32:52.089 187082 DEBUG oslo_concurrency.lockutils [req-1cc8a694-7148-4550-8bef-777349d045eb req-a4e5d699-494c-41e2-af90-47917a612daf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:32:52 compute-1 nova_compute[187078]: 2025-11-24 13:32:52.089 187082 DEBUG nova.compute.manager [req-1cc8a694-7148-4550-8bef-777349d045eb req-a4e5d699-494c-41e2-af90-47917a612daf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:32:52 compute-1 nova_compute[187078]: 2025-11-24 13:32:52.090 187082 WARNING nova.compute.manager [req-1cc8a694-7148-4550-8bef-777349d045eb req-a4e5d699-494c-41e2-af90-47917a612daf 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received unexpected event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with vm_state active and task_state None.
Nov 24 13:32:53 compute-1 nova_compute[187078]: 2025-11-24 13:32:53.323 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:53 compute-1 nova_compute[187078]: 2025-11-24 13:32:53.674 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:55 compute-1 sshd-session[214783]: Invalid user autrede from 175.100.24.139 port 37218
Nov 24 13:32:55 compute-1 sshd-session[214783]: Received disconnect from 175.100.24.139 port 37218:11: Bye Bye [preauth]
Nov 24 13:32:55 compute-1 sshd-session[214783]: Disconnected from invalid user autrede 175.100.24.139 port 37218 [preauth]
Nov 24 13:32:58 compute-1 nova_compute[187078]: 2025-11-24 13:32:58.326 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:58 compute-1 nova_compute[187078]: 2025-11-24 13:32:58.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:32:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:32:59.845 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:33:00 compute-1 podman[214785]: 2025-11-24 13:33:00.530909162 +0000 UTC m=+0.073903168 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Nov 24 13:33:03 compute-1 nova_compute[187078]: 2025-11-24 13:33:03.335 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:03 compute-1 nova_compute[187078]: 2025-11-24 13:33:03.679 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:03 compute-1 ovn_controller[95368]: 2025-11-24T13:33:03Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:8b:e7 10.100.0.7
Nov 24 13:33:03 compute-1 ovn_controller[95368]: 2025-11-24T13:33:03Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:8b:e7 10.100.0.7
Nov 24 13:33:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:04.161 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:04.162 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:04.163 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:05 compute-1 podman[197429]: time="2025-11-24T13:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:33:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:33:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 24 13:33:07 compute-1 sshd-session[214817]: Invalid user ts3 from 68.183.82.237 port 59848
Nov 24 13:33:08 compute-1 sshd-session[214817]: Received disconnect from 68.183.82.237 port 59848:11: Bye Bye [preauth]
Nov 24 13:33:08 compute-1 sshd-session[214817]: Disconnected from invalid user ts3 68.183.82.237 port 59848 [preauth]
Nov 24 13:33:08 compute-1 nova_compute[187078]: 2025-11-24 13:33:08.336 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:08 compute-1 nova_compute[187078]: 2025-11-24 13:33:08.681 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:10 compute-1 nova_compute[187078]: 2025-11-24 13:33:10.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:12 compute-1 nova_compute[187078]: 2025-11-24 13:33:12.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:13 compute-1 nova_compute[187078]: 2025-11-24 13:33:13.372 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:13 compute-1 nova_compute[187078]: 2025-11-24 13:33:13.683 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:15 compute-1 podman[214819]: 2025-11-24 13:33:15.494370069 +0000 UTC m=+0.046131435 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:33:15 compute-1 podman[214820]: 2025-11-24 13:33:15.526622684 +0000 UTC m=+0.074682139 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Nov 24 13:33:15 compute-1 nova_compute[187078]: 2025-11-24 13:33:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.708 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.708 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.708 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.709 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.777 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.844 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.845 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:33:16 compute-1 nova_compute[187078]: 2025-11-24 13:33:16.901 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.071 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.073 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5671MB free_disk=73.43085861206055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.073 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.074 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.166 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance df6b16e5-93cd-49e7-a360-268fee816249 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.166 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.167 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.209 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.222 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.240 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:33:17 compute-1 nova_compute[187078]: 2025-11-24 13:33:17.240 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:18 compute-1 nova_compute[187078]: 2025-11-24 13:33:18.241 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:18 compute-1 nova_compute[187078]: 2025-11-24 13:33:18.376 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:18 compute-1 nova_compute[187078]: 2025-11-24 13:33:18.685 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: ERROR   13:33:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: ERROR   13:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: ERROR   13:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: ERROR   13:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: ERROR   13:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:33:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.830 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.830 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.831 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:33:19 compute-1 nova_compute[187078]: 2025-11-24 13:33:19.831 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid df6b16e5-93cd-49e7-a360-268fee816249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:33:20 compute-1 sshd-session[214867]: Invalid user sol from 45.148.10.240 port 39282
Nov 24 13:33:20 compute-1 sshd-session[214867]: Connection closed by invalid user sol 45.148.10.240 port 39282 [preauth]
Nov 24 13:33:20 compute-1 nova_compute[187078]: 2025-11-24 13:33:20.942 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updating instance_info_cache with network_info: [{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:33:20 compute-1 nova_compute[187078]: 2025-11-24 13:33:20.959 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:33:20 compute-1 nova_compute[187078]: 2025-11-24 13:33:20.960 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:33:20 compute-1 nova_compute[187078]: 2025-11-24 13:33:20.960 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:21 compute-1 sshd-session[214869]: Invalid user halo from 5.198.176.28 port 44870
Nov 24 13:33:21 compute-1 podman[214871]: 2025-11-24 13:33:21.26788246 +0000 UTC m=+0.050034921 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:33:21 compute-1 podman[214872]: 2025-11-24 13:33:21.295328475 +0000 UTC m=+0.073953550 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 13:33:21 compute-1 sshd-session[214869]: Received disconnect from 5.198.176.28 port 44870:11: Bye Bye [preauth]
Nov 24 13:33:21 compute-1 sshd-session[214869]: Disconnected from invalid user halo 5.198.176.28 port 44870 [preauth]
Nov 24 13:33:21 compute-1 nova_compute[187078]: 2025-11-24 13:33:21.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:21 compute-1 nova_compute[187078]: 2025-11-24 13:33:21.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:33:22 compute-1 nova_compute[187078]: 2025-11-24 13:33:22.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:33:23 compute-1 nova_compute[187078]: 2025-11-24 13:33:23.414 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:23 compute-1 nova_compute[187078]: 2025-11-24 13:33:23.688 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:28 compute-1 nova_compute[187078]: 2025-11-24 13:33:28.417 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:28 compute-1 nova_compute[187078]: 2025-11-24 13:33:28.689 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:29 compute-1 ovn_controller[95368]: 2025-11-24T13:33:29Z|00167|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Nov 24 13:33:31 compute-1 podman[214920]: 2025-11-24 13:33:31.515189348 +0000 UTC m=+0.067107714 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 24 13:33:33 compute-1 nova_compute[187078]: 2025-11-24 13:33:33.423 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:33 compute-1 nova_compute[187078]: 2025-11-24 13:33:33.691 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:35 compute-1 podman[197429]: time="2025-11-24T13:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:33:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:33:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 24 13:33:36 compute-1 sshd-session[214918]: Connection closed by 45.78.217.131 port 56652 [preauth]
Nov 24 13:33:38 compute-1 nova_compute[187078]: 2025-11-24 13:33:38.426 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:38 compute-1 nova_compute[187078]: 2025-11-24 13:33:38.693 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:40 compute-1 sshd-session[214946]: Invalid user vtatis from 85.209.134.43 port 40062
Nov 24 13:33:40 compute-1 sshd-session[214946]: Received disconnect from 85.209.134.43 port 40062:11: Bye Bye [preauth]
Nov 24 13:33:40 compute-1 sshd-session[214946]: Disconnected from invalid user vtatis 85.209.134.43 port 40062 [preauth]
Nov 24 13:33:43 compute-1 nova_compute[187078]: 2025-11-24 13:33:43.467 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:43 compute-1 nova_compute[187078]: 2025-11-24 13:33:43.696 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:43 compute-1 sshd-session[214948]: Invalid user autrede from 176.114.89.34 port 56560
Nov 24 13:33:43 compute-1 sshd-session[214948]: Received disconnect from 176.114.89.34 port 56560:11: Bye Bye [preauth]
Nov 24 13:33:43 compute-1 sshd-session[214948]: Disconnected from invalid user autrede 176.114.89.34 port 56560 [preauth]
Nov 24 13:33:46 compute-1 podman[214951]: 2025-11-24 13:33:46.501251319 +0000 UTC m=+0.049201497 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:33:46 compute-1 podman[214950]: 2025-11-24 13:33:46.502475503 +0000 UTC m=+0.053361440 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:33:48 compute-1 nova_compute[187078]: 2025-11-24 13:33:48.469 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:48 compute-1 nova_compute[187078]: 2025-11-24 13:33:48.698 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: ERROR   13:33:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: ERROR   13:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: ERROR   13:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: ERROR   13:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: ERROR   13:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:33:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:33:50 compute-1 nova_compute[187078]: 2025-11-24 13:33:50.500 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Check if temp file /var/lib/nova/instances/tmpq4514j18 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:33:50 compute-1 nova_compute[187078]: 2025-11-24 13:33:50.500 187082 DEBUG nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq4514j18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df6b16e5-93cd-49e7-a360-268fee816249',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:33:51 compute-1 podman[214993]: 2025-11-24 13:33:51.507657834 +0000 UTC m=+0.055024855 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 13:33:51 compute-1 podman[214994]: 2025-11-24 13:33:51.539473409 +0000 UTC m=+0.081041432 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 13:33:51 compute-1 nova_compute[187078]: 2025-11-24 13:33:51.861 187082 DEBUG oslo_concurrency.processutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:33:51 compute-1 nova_compute[187078]: 2025-11-24 13:33:51.922 187082 DEBUG oslo_concurrency.processutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:33:51 compute-1 nova_compute[187078]: 2025-11-24 13:33:51.924 187082 DEBUG oslo_concurrency.processutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:33:52 compute-1 nova_compute[187078]: 2025-11-24 13:33:51.999 187082 DEBUG oslo_concurrency.processutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:33:52 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 13:33:53 compute-1 nova_compute[187078]: 2025-11-24 13:33:53.470 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:53 compute-1 sshd-session[215048]: Accepted publickey for nova from 192.168.122.100 port 58570 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:33:53 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:33:53 compute-1 nova_compute[187078]: 2025-11-24 13:33:53.699 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:53 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:33:53 compute-1 systemd-logind[815]: New session 41 of user nova.
Nov 24 13:33:53 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:33:53 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:33:53 compute-1 systemd[215052]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:33:53 compute-1 systemd[215052]: Queued start job for default target Main User Target.
Nov 24 13:33:53 compute-1 systemd[215052]: Created slice User Application Slice.
Nov 24 13:33:53 compute-1 systemd[215052]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:33:53 compute-1 systemd[215052]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:33:53 compute-1 systemd[215052]: Reached target Paths.
Nov 24 13:33:53 compute-1 systemd[215052]: Reached target Timers.
Nov 24 13:33:53 compute-1 systemd[215052]: Starting D-Bus User Message Bus Socket...
Nov 24 13:33:53 compute-1 systemd[215052]: Starting Create User's Volatile Files and Directories...
Nov 24 13:33:53 compute-1 systemd[215052]: Finished Create User's Volatile Files and Directories.
Nov 24 13:33:53 compute-1 systemd[215052]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:33:53 compute-1 systemd[215052]: Reached target Sockets.
Nov 24 13:33:53 compute-1 systemd[215052]: Reached target Basic System.
Nov 24 13:33:53 compute-1 systemd[215052]: Reached target Main User Target.
Nov 24 13:33:53 compute-1 systemd[215052]: Startup finished in 147ms.
Nov 24 13:33:53 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:33:53 compute-1 systemd[1]: Started Session 41 of User nova.
Nov 24 13:33:53 compute-1 sshd-session[215048]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:33:54 compute-1 sshd-session[215067]: Received disconnect from 192.168.122.100 port 58570:11: disconnected by user
Nov 24 13:33:54 compute-1 sshd-session[215067]: Disconnected from user nova 192.168.122.100 port 58570
Nov 24 13:33:54 compute-1 sshd-session[215048]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:33:54 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Nov 24 13:33:54 compute-1 systemd-logind[815]: Session 41 logged out. Waiting for processes to exit.
Nov 24 13:33:54 compute-1 systemd-logind[815]: Removed session 41.
Nov 24 13:33:54 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:54.738 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:33:54 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:54.739 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.740 187082 DEBUG nova.compute.manager [req-d3a76d4f-72a2-4751-97b9-eef8ead3536a req-4cc7847f-a956-443e-9ebe-b5a2dda28ea7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.741 187082 DEBUG oslo_concurrency.lockutils [req-d3a76d4f-72a2-4751-97b9-eef8ead3536a req-4cc7847f-a956-443e-9ebe-b5a2dda28ea7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.741 187082 DEBUG oslo_concurrency.lockutils [req-d3a76d4f-72a2-4751-97b9-eef8ead3536a req-4cc7847f-a956-443e-9ebe-b5a2dda28ea7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.743 187082 DEBUG oslo_concurrency.lockutils [req-d3a76d4f-72a2-4751-97b9-eef8ead3536a req-4cc7847f-a956-443e-9ebe-b5a2dda28ea7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.743 187082 DEBUG nova.compute.manager [req-d3a76d4f-72a2-4751-97b9-eef8ead3536a req-4cc7847f-a956-443e-9ebe-b5a2dda28ea7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.743 187082 DEBUG nova.compute.manager [req-d3a76d4f-72a2-4751-97b9-eef8ead3536a req-4cc7847f-a956-443e-9ebe-b5a2dda28ea7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:33:54 compute-1 nova_compute[187078]: 2025-11-24 13:33:54.744 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.408 187082 INFO nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Took 3.41 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.409 187082 DEBUG nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.424 187082 DEBUG nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq4514j18',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='df6b16e5-93cd-49e7-a360-268fee816249',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(af9f6e5f-9b83-48b3-b1fc-17575dcce118),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.439 187082 DEBUG nova.objects.instance [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid df6b16e5-93cd-49e7-a360-268fee816249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.440 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.442 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.442 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.460 187082 DEBUG nova.virt.libvirt.vif [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:32:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-890041496',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-890041496',id=17,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:32:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-scejrrau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:32:50Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=df6b16e5-93cd-49e7-a360-268fee816249,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.460 187082 DEBUG nova.network.os_vif_util [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.461 187082 DEBUG nova.network.os_vif_util [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.462 187082 DEBUG nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:33:55 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:c2:8b:e7"/>
Nov 24 13:33:55 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:33:55 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:33:55 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:33:55 compute-1 nova_compute[187078]:   <target dev="tapdf82121b-bc"/>
Nov 24 13:33:55 compute-1 nova_compute[187078]: </interface>
Nov 24 13:33:55 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.462 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.945 187082 DEBUG nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:33:55 compute-1 nova_compute[187078]: 2025-11-24 13:33:55.946 187082 INFO nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.001 187082 INFO nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.505 187082 DEBUG nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.507 187082 DEBUG nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.815 187082 DEBUG nova.compute.manager [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.816 187082 DEBUG oslo_concurrency.lockutils [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.816 187082 DEBUG oslo_concurrency.lockutils [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.816 187082 DEBUG oslo_concurrency.lockutils [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.816 187082 DEBUG nova.compute.manager [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.817 187082 WARNING nova.compute.manager [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received unexpected event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with vm_state active and task_state migrating.
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.817 187082 DEBUG nova.compute.manager [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-changed-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.817 187082 DEBUG nova.compute.manager [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Refreshing instance network info cache due to event network-changed-df82121b-bc7d-43b5-8380-1339b831f456. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.817 187082 DEBUG oslo_concurrency.lockutils [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.817 187082 DEBUG oslo_concurrency.lockutils [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.818 187082 DEBUG nova.network.neutron [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Refreshing network info cache for port df82121b-bc7d-43b5-8380-1339b831f456 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.989 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991236.989014, df6b16e5-93cd-49e7-a360-268fee816249 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:33:56 compute-1 nova_compute[187078]: 2025-11-24 13:33:56.990 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] VM Paused (Lifecycle Event)
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.009 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.015 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.022 187082 DEBUG nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.023 187082 DEBUG nova.virt.libvirt.migration [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.033 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:33:57 compute-1 kernel: tapdf82121b-bc (unregistering): left promiscuous mode
Nov 24 13:33:57 compute-1 NetworkManager[55527]: <info>  [1763991237.1410] device (tapdf82121b-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.154 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 ovn_controller[95368]: 2025-11-24T13:33:57Z|00168|binding|INFO|Releasing lport df82121b-bc7d-43b5-8380-1339b831f456 from this chassis (sb_readonly=0)
Nov 24 13:33:57 compute-1 ovn_controller[95368]: 2025-11-24T13:33:57Z|00169|binding|INFO|Setting lport df82121b-bc7d-43b5-8380-1339b831f456 down in Southbound
Nov 24 13:33:57 compute-1 ovn_controller[95368]: 2025-11-24T13:33:57Z|00170|binding|INFO|Removing iface tapdf82121b-bc ovn-installed in OVS
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.157 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.176 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:8b:e7 10.100.0.7'], port_security=['fa:16:3e:c2:8b:e7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'df6b16e5-93cd-49e7-a360-268fee816249', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=df82121b-bc7d-43b5-8380-1339b831f456) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.177 104225 INFO neutron.agent.ovn.metadata.agent [-] Port df82121b-bc7d-43b5-8380-1339b831f456 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.178 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.179 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9e99a8-8d52-4a8d-a711-5827abad9ff8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.180 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.187 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Nov 24 13:33:57 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 16.307s CPU time.
Nov 24 13:33:57 compute-1 systemd-machined[153355]: Machine qemu-13-instance-00000011 terminated.
Nov 24 13:33:57 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [NOTICE]   (214761) : haproxy version is 2.8.14-c23fe91
Nov 24 13:33:57 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [NOTICE]   (214761) : path to executable is /usr/sbin/haproxy
Nov 24 13:33:57 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [WARNING]  (214761) : Exiting Master process...
Nov 24 13:33:57 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [ALERT]    (214761) : Current worker (214769) exited with code 143 (Terminated)
Nov 24 13:33:57 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[214730]: [WARNING]  (214761) : All workers exited. Exiting... (0)
Nov 24 13:33:57 compute-1 systemd[1]: libpod-24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad.scope: Deactivated successfully.
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.346 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 podman[215105]: 2025-11-24 13:33:57.346777179 +0000 UTC m=+0.058395258 container died 24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.356 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad-userdata-shm.mount: Deactivated successfully.
Nov 24 13:33:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-a51f16aaf78b7f8f1939b431baab81b71f71266017a763d31dba7d2dec9777fc-merged.mount: Deactivated successfully.
Nov 24 13:33:57 compute-1 podman[215105]: 2025-11-24 13:33:57.401098854 +0000 UTC m=+0.112716903 container cleanup 24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.411 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.411 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.412 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:33:57 compute-1 systemd[1]: libpod-conmon-24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad.scope: Deactivated successfully.
Nov 24 13:33:57 compute-1 podman[215151]: 2025-11-24 13:33:57.479646898 +0000 UTC m=+0.048444487 container remove 24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.486 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[79c7a5c6-9619-4869-a4b9-4cfd53a9f49f]: (4, ('Mon Nov 24 01:33:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad)\n24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad\nMon Nov 24 01:33:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad)\n24dde7880de467d6e282e4afb6e25356ef305bcb84f9c2b83bb63e0e79c100ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.488 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab21c57-a308-4244-a17e-8bfa4829ca27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.489 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.528 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.530 187082 DEBUG nova.virt.libvirt.guest [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'df6b16e5-93cd-49e7-a360-268fee816249' (instance-00000011) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.530 187082 INFO nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migration operation has completed
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.531 187082 INFO nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] _post_live_migration() is started..
Nov 24 13:33:57 compute-1 nova_compute[187078]: 2025-11-24 13:33:57.551 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.554 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4e292900-a286-4c99-b553-fd9d3cc29bcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.569 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[56d7b23a-b129-48eb-903c-b088fddc47f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.570 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6bdd25-4845-4877-89cd-399be1cf5ce0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.590 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2de869-9d6c-43bc-bbbf-656348b72c02]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418853, 'reachable_time': 32900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215173, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.593 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:33:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:57.593 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1f5002-5eff-4500-bf76-8202069c80f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:33:57 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.022 187082 DEBUG nova.network.neutron [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updated VIF entry in instance network info cache for port df82121b-bc7d-43b5-8380-1339b831f456. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.022 187082 DEBUG nova.network.neutron [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Updating instance_info_cache with network_info: [{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.039 187082 DEBUG oslo_concurrency.lockutils [req-f2eaf68e-a3d5-4a45-ad1d-e1585c988f25 req-e1777d30-85d2-49b8-9f68-2a697bafa180 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-df6b16e5-93cd-49e7-a360-268fee816249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.201 187082 DEBUG nova.network.neutron [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port df82121b-bc7d-43b5-8380-1339b831f456 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.202 187082 DEBUG nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.203 187082 DEBUG nova.virt.libvirt.vif [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:32:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-890041496',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-890041496',id=17,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:32:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-scejrrau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:33:48Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=df6b16e5-93cd-49e7-a360-268fee816249,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.203 187082 DEBUG nova.network.os_vif_util [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "df82121b-bc7d-43b5-8380-1339b831f456", "address": "fa:16:3e:c2:8b:e7", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf82121b-bc", "ovs_interfaceid": "df82121b-bc7d-43b5-8380-1339b831f456", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.205 187082 DEBUG nova.network.os_vif_util [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.205 187082 DEBUG os_vif [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.208 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.208 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf82121b-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.211 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.214 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.217 187082 INFO os_vif [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:8b:e7,bridge_name='br-int',has_traffic_filtering=True,id=df82121b-bc7d-43b5-8380-1339b831f456,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf82121b-bc')
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.218 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.219 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.219 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.220 187082 DEBUG nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.220 187082 INFO nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Deleting instance files /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249_del
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.222 187082 INFO nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Deletion of /var/lib/nova/instances/df6b16e5-93cd-49e7-a360-268fee816249_del complete
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.472 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.897 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.897 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.897 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.897 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.898 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.898 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.898 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.899 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.899 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.899 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.900 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.900 187082 WARNING nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received unexpected event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with vm_state active and task_state migrating.
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.900 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.900 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.900 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.900 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.901 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.901 187082 WARNING nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received unexpected event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with vm_state active and task_state migrating.
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.901 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.901 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.901 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.902 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.902 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.902 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-unplugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.902 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.902 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.903 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.903 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.903 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.903 187082 WARNING nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received unexpected event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with vm_state active and task_state migrating.
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.903 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.904 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.904 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.904 187082 DEBUG oslo_concurrency.lockutils [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.904 187082 DEBUG nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] No waiting events found dispatching network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:33:58 compute-1 nova_compute[187078]: 2025-11-24 13:33:58.904 187082 WARNING nova.compute.manager [req-f167df10-0bab-4909-93e2-f6a84764ba4e req-6fc235d6-1b0c-473f-b529-3849b9cd4362 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Received unexpected event network-vif-plugged-df82121b-bc7d-43b5-8380-1339b831f456 for instance with vm_state active and task_state migrating.
Nov 24 13:33:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:33:59.741 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.371 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "df6b16e5-93cd-49e7-a360-268fee816249-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.372 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.372 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "df6b16e5-93cd-49e7-a360-268fee816249-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.426 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.427 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.428 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.429 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:34:02 compute-1 podman[215174]: 2025-11-24 13:34:02.5592365 +0000 UTC m=+0.094212880 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.674 187082 WARNING nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.677 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5859MB free_disk=73.4601058959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.678 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.679 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.718 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance df6b16e5-93cd-49e7-a360-268fee816249 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.739 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.771 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration af9f6e5f-9b83-48b3-b1fc-17575dcce118 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.771 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.772 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.819 187082 DEBUG nova.compute.provider_tree [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.829 187082 DEBUG nova.scheduler.client.report [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.845 187082 DEBUG nova.compute.resource_tracker [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.845 187082 DEBUG oslo_concurrency.lockutils [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.850 187082 INFO nova.compute.manager [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.911 187082 INFO nova.scheduler.client.report [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration af9f6e5f-9b83-48b3-b1fc-17575dcce118
Nov 24 13:34:02 compute-1 nova_compute[187078]: 2025-11-24 13:34:02.911 187082 DEBUG nova.virt.libvirt.driver [None req-e6b213a6-c905-4613-adb6-64032dc6f5b7 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:34:03 compute-1 nova_compute[187078]: 2025-11-24 13:34:03.241 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:03 compute-1 nova_compute[187078]: 2025-11-24 13:34:03.474 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:34:04.162 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:34:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:34:04.163 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:34:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:34:04.163 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:34:04 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:34:04 compute-1 systemd[215052]: Activating special unit Exit the Session...
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped target Main User Target.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped target Basic System.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped target Paths.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped target Sockets.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped target Timers.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:34:04 compute-1 systemd[215052]: Closed D-Bus User Message Bus Socket.
Nov 24 13:34:04 compute-1 systemd[215052]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:34:04 compute-1 systemd[215052]: Removed slice User Application Slice.
Nov 24 13:34:04 compute-1 systemd[215052]: Reached target Shutdown.
Nov 24 13:34:04 compute-1 systemd[215052]: Finished Exit the Session.
Nov 24 13:34:04 compute-1 systemd[215052]: Reached target Exit the Session.
Nov 24 13:34:04 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:34:04 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:34:04 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:34:04 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:34:04 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:34:04 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:34:04 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:34:05 compute-1 podman[197429]: time="2025-11-24T13:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:34:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:34:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Nov 24 13:34:08 compute-1 nova_compute[187078]: 2025-11-24 13:34:08.244 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:08 compute-1 nova_compute[187078]: 2025-11-24 13:34:08.476 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:12 compute-1 nova_compute[187078]: 2025-11-24 13:34:12.409 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991237.40685, df6b16e5-93cd-49e7-a360-268fee816249 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:34:12 compute-1 nova_compute[187078]: 2025-11-24 13:34:12.409 187082 INFO nova.compute.manager [-] [instance: df6b16e5-93cd-49e7-a360-268fee816249] VM Stopped (Lifecycle Event)
Nov 24 13:34:12 compute-1 nova_compute[187078]: 2025-11-24 13:34:12.431 187082 DEBUG nova.compute.manager [None req-b8b2a1f2-67d5-4d39-b1ce-14d217f20a55 - - - - - -] [instance: df6b16e5-93cd-49e7-a360-268fee816249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:34:12 compute-1 nova_compute[187078]: 2025-11-24 13:34:12.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:13 compute-1 nova_compute[187078]: 2025-11-24 13:34:13.246 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:13 compute-1 nova_compute[187078]: 2025-11-24 13:34:13.479 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:14 compute-1 nova_compute[187078]: 2025-11-24 13:34:14.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:16 compute-1 nova_compute[187078]: 2025-11-24 13:34:16.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:17 compute-1 podman[215198]: 2025-11-24 13:34:17.527607634 +0000 UTC m=+0.055220502 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:34:17 compute-1 podman[215199]: 2025-11-24 13:34:17.542702554 +0000 UTC m=+0.071088643 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 24 13:34:17 compute-1 nova_compute[187078]: 2025-11-24 13:34:17.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.296 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.481 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.694 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.694 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.694 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.874 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.875 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5868MB free_disk=73.4601058959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.875 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.875 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.932 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.933 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.964 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.988 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:34:18 compute-1 nova_compute[187078]: 2025-11-24 13:34:18.989 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:34:19 compute-1 nova_compute[187078]: 2025-11-24 13:34:19.006 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:34:19 compute-1 nova_compute[187078]: 2025-11-24 13:34:19.029 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:34:19 compute-1 nova_compute[187078]: 2025-11-24 13:34:19.058 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:34:19 compute-1 nova_compute[187078]: 2025-11-24 13:34:19.074 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:34:19 compute-1 nova_compute[187078]: 2025-11-24 13:34:19.076 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:34:19 compute-1 nova_compute[187078]: 2025-11-24 13:34:19.076 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: ERROR   13:34:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: ERROR   13:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: ERROR   13:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: ERROR   13:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: ERROR   13:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:34:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.076 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.077 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.078 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.091 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.093 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:21 compute-1 nova_compute[187078]: 2025-11-24 13:34:21.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:34:22 compute-1 podman[215237]: 2025-11-24 13:34:22.531316196 +0000 UTC m=+0.079542483 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:34:22 compute-1 podman[215238]: 2025-11-24 13:34:22.589443605 +0000 UTC m=+0.136604453 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 24 13:34:23 compute-1 nova_compute[187078]: 2025-11-24 13:34:23.345 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:23 compute-1 nova_compute[187078]: 2025-11-24 13:34:23.483 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:24 compute-1 nova_compute[187078]: 2025-11-24 13:34:24.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:25 compute-1 sshd-session[215283]: Invalid user apagar from 68.183.82.237 port 39236
Nov 24 13:34:25 compute-1 sshd-session[215283]: Received disconnect from 68.183.82.237 port 39236:11: Bye Bye [preauth]
Nov 24 13:34:25 compute-1 sshd-session[215283]: Disconnected from invalid user apagar 68.183.82.237 port 39236 [preauth]
Nov 24 13:34:26 compute-1 nova_compute[187078]: 2025-11-24 13:34:26.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:34:28 compute-1 nova_compute[187078]: 2025-11-24 13:34:28.348 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:28 compute-1 nova_compute[187078]: 2025-11-24 13:34:28.485 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:30 compute-1 sshd-session[215285]: Received disconnect from 5.198.176.28 port 44978:11: Bye Bye [preauth]
Nov 24 13:34:30 compute-1 sshd-session[215285]: Disconnected from authenticating user root 5.198.176.28 port 44978 [preauth]
Nov 24 13:34:32 compute-1 sshd-session[215287]: Invalid user copia from 175.100.24.139 port 39498
Nov 24 13:34:33 compute-1 podman[215290]: 2025-11-24 13:34:33.006280197 +0000 UTC m=+0.061786020 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7)
Nov 24 13:34:33 compute-1 sshd-session[215287]: Received disconnect from 175.100.24.139 port 39498:11: Bye Bye [preauth]
Nov 24 13:34:33 compute-1 sshd-session[215287]: Disconnected from invalid user copia 175.100.24.139 port 39498 [preauth]
Nov 24 13:34:33 compute-1 nova_compute[187078]: 2025-11-24 13:34:33.390 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:33 compute-1 nova_compute[187078]: 2025-11-24 13:34:33.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:35 compute-1 podman[197429]: time="2025-11-24T13:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:34:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:34:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 24 13:34:38 compute-1 nova_compute[187078]: 2025-11-24 13:34:38.394 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:38 compute-1 nova_compute[187078]: 2025-11-24 13:34:38.489 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:38 compute-1 ovn_controller[95368]: 2025-11-24T13:34:38Z|00171|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Nov 24 13:34:43 compute-1 nova_compute[187078]: 2025-11-24 13:34:43.397 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:43 compute-1 nova_compute[187078]: 2025-11-24 13:34:43.493 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:43 compute-1 sshd-session[215310]: Invalid user validator from 193.32.162.146 port 39204
Nov 24 13:34:43 compute-1 sshd-session[215310]: Connection closed by invalid user validator 193.32.162.146 port 39204 [preauth]
Nov 24 13:34:45 compute-1 sshd-session[215312]: Received disconnect from 193.46.255.7 port 10768:11:  [preauth]
Nov 24 13:34:45 compute-1 sshd-session[215312]: Disconnected from authenticating user root 193.46.255.7 port 10768 [preauth]
Nov 24 13:34:48 compute-1 nova_compute[187078]: 2025-11-24 13:34:48.399 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:48 compute-1 nova_compute[187078]: 2025-11-24 13:34:48.495 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:48 compute-1 podman[215316]: 2025-11-24 13:34:48.500674837 +0000 UTC m=+0.050038261 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:34:48 compute-1 podman[215317]: 2025-11-24 13:34:48.508565132 +0000 UTC m=+0.057715390 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:34:48 compute-1 sshd-session[215314]: Invalid user sens from 176.114.89.34 port 39018
Nov 24 13:34:48 compute-1 sshd-session[215314]: Received disconnect from 176.114.89.34 port 39018:11: Bye Bye [preauth]
Nov 24 13:34:48 compute-1 sshd-session[215314]: Disconnected from invalid user sens 176.114.89.34 port 39018 [preauth]
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: ERROR   13:34:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: ERROR   13:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: ERROR   13:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: ERROR   13:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: ERROR   13:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:34:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:34:53 compute-1 nova_compute[187078]: 2025-11-24 13:34:53.448 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:53 compute-1 nova_compute[187078]: 2025-11-24 13:34:53.497 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:53 compute-1 podman[215359]: 2025-11-24 13:34:53.542775573 +0000 UTC m=+0.071417163 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd)
Nov 24 13:34:53 compute-1 podman[215360]: 2025-11-24 13:34:53.562327534 +0000 UTC m=+0.088006543 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 24 13:34:58 compute-1 nova_compute[187078]: 2025-11-24 13:34:58.452 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:34:58 compute-1 nova_compute[187078]: 2025-11-24 13:34:58.497 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.271 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.272 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.289 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.359 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.359 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.366 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.366 187082 INFO nova.compute.claims [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.449 187082 DEBUG nova.compute.provider_tree [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.479 187082 DEBUG nova.scheduler.client.report [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.499 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.500 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.541 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.541 187082 DEBUG nova.network.neutron [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.558 187082 INFO nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.573 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.656 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.657 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.657 187082 INFO nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Creating image(s)
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.657 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.658 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.658 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.669 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.724 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.725 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.725 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.736 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.791 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.792 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.823 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.824 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.825 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.878 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.879 187082 DEBUG nova.virt.disk.api [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.880 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.934 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.935 187082 DEBUG nova.virt.disk.api [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.936 187082 DEBUG nova.objects.instance [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.952 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.952 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Ensure instance console log exists: /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.953 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.953 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:01 compute-1 nova_compute[187078]: 2025-11-24 13:35:01.953 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:02 compute-1 nova_compute[187078]: 2025-11-24 13:35:02.427 187082 DEBUG nova.policy [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:35:03 compute-1 nova_compute[187078]: 2025-11-24 13:35:03.455 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:03 compute-1 nova_compute[187078]: 2025-11-24 13:35:03.499 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:03 compute-1 podman[215415]: 2025-11-24 13:35:03.515620186 +0000 UTC m=+0.061166564 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:35:03 compute-1 nova_compute[187078]: 2025-11-24 13:35:03.531 187082 DEBUG nova.network.neutron [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Successfully created port: 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:35:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:04.164 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:04.164 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:04.164 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.656 187082 DEBUG nova.network.neutron [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Successfully updated port: 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.667 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.667 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.668 187082 DEBUG nova.network.neutron [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.732 187082 DEBUG nova.compute.manager [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-changed-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.733 187082 DEBUG nova.compute.manager [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Refreshing instance network info cache due to event network-changed-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.733 187082 DEBUG oslo_concurrency.lockutils [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:35:04 compute-1 nova_compute[187078]: 2025-11-24 13:35:04.803 187082 DEBUG nova.network.neutron [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:35:05 compute-1 podman[197429]: time="2025-11-24T13:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:35:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:35:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.792 187082 DEBUG nova.network.neutron [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updating instance_info_cache with network_info: [{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.816 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.816 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Instance network_info: |[{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.817 187082 DEBUG oslo_concurrency.lockutils [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.817 187082 DEBUG nova.network.neutron [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Refreshing network info cache for port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.820 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Start _get_guest_xml network_info=[{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.825 187082 WARNING nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.834 187082 DEBUG nova.virt.libvirt.host [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.835 187082 DEBUG nova.virt.libvirt.host [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.839 187082 DEBUG nova.virt.libvirt.host [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.840 187082 DEBUG nova.virt.libvirt.host [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.841 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.841 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.842 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.842 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.842 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.842 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.842 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.843 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.843 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.843 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.843 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.843 187082 DEBUG nova.virt.hardware [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.847 187082 DEBUG nova.virt.libvirt.vif [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:35:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1773054992',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1773054992',id=20,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-wldlv05c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:35:01Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=64ec0fe3-a86c-4fe4-acd6-a449e46530e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.847 187082 DEBUG nova.network.os_vif_util [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.848 187082 DEBUG nova.network.os_vif_util [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.848 187082 DEBUG nova.objects.instance [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.860 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <uuid>64ec0fe3-a86c-4fe4-acd6-a449e46530e0</uuid>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <name>instance-00000014</name>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-1773054992</nova:name>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:35:05</nova:creationTime>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         <nova:port uuid="49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02">
Nov 24 13:35:05 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <system>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <entry name="serial">64ec0fe3-a86c-4fe4-acd6-a449e46530e0</entry>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <entry name="uuid">64ec0fe3-a86c-4fe4-acd6-a449e46530e0</entry>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </system>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <os>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </os>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <features>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </features>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.config"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:59:ee:8f"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <target dev="tap49ddc2d5-3f"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/console.log" append="off"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <video>
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </video>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:35:05 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:35:05 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:35:05 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:35:05 compute-1 nova_compute[187078]: </domain>
Nov 24 13:35:05 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.861 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Preparing to wait for external event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.861 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.862 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.862 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.862 187082 DEBUG nova.virt.libvirt.vif [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:35:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1773054992',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1773054992',id=20,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-wldlv05c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:35:01Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=64ec0fe3-a86c-4fe4-acd6-a449e46530e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.863 187082 DEBUG nova.network.os_vif_util [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.863 187082 DEBUG nova.network.os_vif_util [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.864 187082 DEBUG os_vif [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.864 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.865 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.865 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.867 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.867 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49ddc2d5-3f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.868 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap49ddc2d5-3f, col_values=(('external_ids', {'iface-id': '49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:ee:8f', 'vm-uuid': '64ec0fe3-a86c-4fe4-acd6-a449e46530e0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:35:05 compute-1 NetworkManager[55527]: <info>  [1763991305.8701] manager: (tap49ddc2d5-3f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.870 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.874 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.876 187082 INFO os_vif [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f')
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.915 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.915 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.915 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:59:ee:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:35:05 compute-1 nova_compute[187078]: 2025-11-24 13:35:05.915 187082 INFO nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Using config drive
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.526 187082 INFO nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Creating config drive at /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.config
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.531 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbcyj3411 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.651 187082 DEBUG oslo_concurrency.processutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbcyj3411" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:06 compute-1 kernel: tap49ddc2d5-3f: entered promiscuous mode
Nov 24 13:35:06 compute-1 NetworkManager[55527]: <info>  [1763991306.7300] manager: (tap49ddc2d5-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.749 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:06 compute-1 ovn_controller[95368]: 2025-11-24T13:35:06Z|00172|binding|INFO|Claiming lport 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for this chassis.
Nov 24 13:35:06 compute-1 ovn_controller[95368]: 2025-11-24T13:35:06Z|00173|binding|INFO|49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02: Claiming fa:16:3e:59:ee:8f 10.100.0.5
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.759 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:ee:8f 10.100.0.5'], port_security=['fa:16:3e:59:ee:8f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '64ec0fe3-a86c-4fe4-acd6-a449e46530e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:35:06 compute-1 ovn_controller[95368]: 2025-11-24T13:35:06Z|00174|binding|INFO|Setting lport 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 ovn-installed in OVS
Nov 24 13:35:06 compute-1 ovn_controller[95368]: 2025-11-24T13:35:06Z|00175|binding|INFO|Setting lport 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 up in Southbound
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.761 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.763 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.764 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.766 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.769 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:06 compute-1 systemd-udevd[215456]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.783 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b61deee3-41d5-4156-af1c-d00e888a551d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.786 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:35:06 compute-1 NetworkManager[55527]: <info>  [1763991306.7919] device (tap49ddc2d5-3f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:35:06 compute-1 NetworkManager[55527]: <info>  [1763991306.7927] device (tap49ddc2d5-3f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.790 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.790 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[160df25a-6e97-4079-a1b0-8eb142887bd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.794 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9499df-917b-467f-85fe-7c7e40de6429]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 systemd-machined[153355]: New machine qemu-14-instance-00000014.
Nov 24 13:35:06 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000014.
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.806 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[98575c1d-3305-486c-9bed-7102d8b3280d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.828 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[780067f4-c91b-4164-8d87-ef12fbe76e0b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.863 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[99ced11c-2104-4b84-a50c-c2410d05bcf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 NetworkManager[55527]: <info>  [1763991306.8721] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Nov 24 13:35:06 compute-1 systemd-udevd[215459]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.872 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b618bc4d-20d0-4030-b502-54da8042b15c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.910 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[04663a38-c0cf-4bf8-8ea7-4ae5286253f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.912 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[f9269f70-f62c-48c1-b814-f0a62c80b4af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 NetworkManager[55527]: <info>  [1763991306.9434] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.949 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[5cedd9cb-651f-4a65-8043-58f6fae574f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.961 187082 DEBUG nova.compute.manager [req-f9d1fe86-fbc5-43f3-8420-8d00f577f7d5 req-6879a218-8d44-4e0f-bb5c-50a6fd2a5465 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.961 187082 DEBUG oslo_concurrency.lockutils [req-f9d1fe86-fbc5-43f3-8420-8d00f577f7d5 req-6879a218-8d44-4e0f-bb5c-50a6fd2a5465 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.962 187082 DEBUG oslo_concurrency.lockutils [req-f9d1fe86-fbc5-43f3-8420-8d00f577f7d5 req-6879a218-8d44-4e0f-bb5c-50a6fd2a5465 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.962 187082 DEBUG oslo_concurrency.lockutils [req-f9d1fe86-fbc5-43f3-8420-8d00f577f7d5 req-6879a218-8d44-4e0f-bb5c-50a6fd2a5465 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:06 compute-1 nova_compute[187078]: 2025-11-24 13:35:06.962 187082 DEBUG nova.compute.manager [req-f9d1fe86-fbc5-43f3-8420-8d00f577f7d5 req-6879a218-8d44-4e0f-bb5c-50a6fd2a5465 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Processing event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.968 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ecda22e0-9f5b-44c9-916c-26b696589a43]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432560, 'reachable_time': 28469, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215489, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:06.985 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e52ed879-d0db-48aa-bec5-6d0d15248016]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432560, 'tstamp': 432560}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215490, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.004 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c0fddc12-a756-4318-a6c8-a68df94408bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432560, 'reachable_time': 28469, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215491, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.040 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ba41abd3-8dae-4cb2-9c6c-0c03b0fd0e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.107 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[88654c60-c525-4bd5-813d-ba5e355947b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.109 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.110 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.110 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:35:07 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:35:07 compute-1 NetworkManager[55527]: <info>  [1763991307.1131] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.113 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.115 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:35:07 compute-1 ovn_controller[95368]: 2025-11-24T13:35:07Z|00176|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.118 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.119 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[35b4cccf-3824-4c63-b16e-96b85ec17ac8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.119 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:35:07 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:07.121 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.129 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.289 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.290 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991307.289224, 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.291 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] VM Started (Lifecycle Event)
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.294 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.310 187082 INFO nova.virt.libvirt.driver [-] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Instance spawned successfully.
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.312 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.313 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.320 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.327 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.328 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.328 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.329 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.329 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.329 187082 DEBUG nova.virt.libvirt.driver [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.345 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.345 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991307.2895207, 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.345 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] VM Paused (Lifecycle Event)
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.363 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.368 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991307.293289, 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.368 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] VM Resumed (Lifecycle Event)
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.372 187082 INFO nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Took 5.72 seconds to spawn the instance on the hypervisor.
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.372 187082 DEBUG nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.400 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.404 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.436 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.458 187082 INFO nova.compute.manager [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Took 6.12 seconds to build instance.
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.468 187082 DEBUG nova.network.neutron [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updated VIF entry in instance network info cache for port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.468 187082 DEBUG nova.network.neutron [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updating instance_info_cache with network_info: [{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.471 187082 DEBUG oslo_concurrency.lockutils [None req-88baf6b3-3230-4d11-9212-b07231349b17 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:07 compute-1 nova_compute[187078]: 2025-11-24 13:35:07.480 187082 DEBUG oslo_concurrency.lockutils [req-c204266a-e439-4c3b-bda5-e53414524d51 req-16cd38be-670b-4e8a-877a-cca87cbbc006 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:35:07 compute-1 podman[215528]: 2025-11-24 13:35:07.482265456 +0000 UTC m=+0.049105805 container create d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:35:07 compute-1 systemd[1]: Started libpod-conmon-d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046.scope.
Nov 24 13:35:07 compute-1 podman[215528]: 2025-11-24 13:35:07.457465462 +0000 UTC m=+0.024305831 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:35:07 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:35:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8eb380a3b14c6c50c772294f31d37729c71531f27cd82beb6070ffe5ca45f573/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:35:07 compute-1 podman[215528]: 2025-11-24 13:35:07.578414109 +0000 UTC m=+0.145254478 container init d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:35:07 compute-1 podman[215528]: 2025-11-24 13:35:07.589033048 +0000 UTC m=+0.155873397 container start d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 13:35:07 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [NOTICE]   (215548) : New worker (215550) forked
Nov 24 13:35:07 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [NOTICE]   (215548) : Loading success.
Nov 24 13:35:08 compute-1 nova_compute[187078]: 2025-11-24 13:35:08.500 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:09 compute-1 nova_compute[187078]: 2025-11-24 13:35:09.025 187082 DEBUG nova.compute.manager [req-0a9f91fb-3663-424a-a551-8d9fa797d582 req-ccaf6cf1-9839-4993-bebc-b3a9441e47ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:35:09 compute-1 nova_compute[187078]: 2025-11-24 13:35:09.026 187082 DEBUG oslo_concurrency.lockutils [req-0a9f91fb-3663-424a-a551-8d9fa797d582 req-ccaf6cf1-9839-4993-bebc-b3a9441e47ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:09 compute-1 nova_compute[187078]: 2025-11-24 13:35:09.026 187082 DEBUG oslo_concurrency.lockutils [req-0a9f91fb-3663-424a-a551-8d9fa797d582 req-ccaf6cf1-9839-4993-bebc-b3a9441e47ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:09 compute-1 nova_compute[187078]: 2025-11-24 13:35:09.026 187082 DEBUG oslo_concurrency.lockutils [req-0a9f91fb-3663-424a-a551-8d9fa797d582 req-ccaf6cf1-9839-4993-bebc-b3a9441e47ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:09 compute-1 nova_compute[187078]: 2025-11-24 13:35:09.027 187082 DEBUG nova.compute.manager [req-0a9f91fb-3663-424a-a551-8d9fa797d582 req-ccaf6cf1-9839-4993-bebc-b3a9441e47ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:35:09 compute-1 nova_compute[187078]: 2025-11-24 13:35:09.027 187082 WARNING nova.compute.manager [req-0a9f91fb-3663-424a-a551-8d9fa797d582 req-ccaf6cf1-9839-4993-bebc-b3a9441e47ed 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received unexpected event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with vm_state active and task_state None.
Nov 24 13:35:10 compute-1 nova_compute[187078]: 2025-11-24 13:35:10.870 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:12 compute-1 nova_compute[187078]: 2025-11-24 13:35:12.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:12 compute-1 nova_compute[187078]: 2025-11-24 13:35:12.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:13 compute-1 nova_compute[187078]: 2025-11-24 13:35:13.503 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:14 compute-1 nova_compute[187078]: 2025-11-24 13:35:14.677 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:15 compute-1 nova_compute[187078]: 2025-11-24 13:35:15.872 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:16 compute-1 nova_compute[187078]: 2025-11-24 13:35:16.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.505 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.693 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.777 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:18 compute-1 podman[215561]: 2025-11-24 13:35:18.808121301 +0000 UTC m=+0.052953080 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 13:35:18 compute-1 podman[215559]: 2025-11-24 13:35:18.821642158 +0000 UTC m=+0.069613432 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.870 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.872 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:18 compute-1 nova_compute[187078]: 2025-11-24 13:35:18.945 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.152 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.153 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5687MB free_disk=73.45927429199219GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.154 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.154 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.276 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.277 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.277 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.406 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: ERROR   13:35:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: ERROR   13:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: ERROR   13:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: ERROR   13:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: ERROR   13:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:35:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.461 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.481 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:35:19 compute-1 nova_compute[187078]: 2025-11-24 13:35:19.481 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:19 compute-1 ovn_controller[95368]: 2025-11-24T13:35:19Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:ee:8f 10.100.0.5
Nov 24 13:35:19 compute-1 ovn_controller[95368]: 2025-11-24T13:35:19Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:ee:8f 10.100.0.5
Nov 24 13:35:20 compute-1 nova_compute[187078]: 2025-11-24 13:35:20.875 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:22 compute-1 nova_compute[187078]: 2025-11-24 13:35:22.481 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:22 compute-1 nova_compute[187078]: 2025-11-24 13:35:22.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:22 compute-1 nova_compute[187078]: 2025-11-24 13:35:22.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:35:22 compute-1 nova_compute[187078]: 2025-11-24 13:35:22.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:35:23 compute-1 sshd-session[215623]: Invalid user sol from 45.148.10.240 port 54848
Nov 24 13:35:23 compute-1 sshd-session[215623]: Connection closed by invalid user sol 45.148.10.240 port 54848 [preauth]
Nov 24 13:35:23 compute-1 nova_compute[187078]: 2025-11-24 13:35:23.414 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:35:23 compute-1 nova_compute[187078]: 2025-11-24 13:35:23.415 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:35:23 compute-1 nova_compute[187078]: 2025-11-24 13:35:23.415 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:35:23 compute-1 nova_compute[187078]: 2025-11-24 13:35:23.415 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:35:23 compute-1 nova_compute[187078]: 2025-11-24 13:35:23.536 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:24 compute-1 podman[215625]: 2025-11-24 13:35:24.560388116 +0000 UTC m=+0.094024097 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:35:24 compute-1 podman[215626]: 2025-11-24 13:35:24.571505448 +0000 UTC m=+0.103196966 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:35:25 compute-1 nova_compute[187078]: 2025-11-24 13:35:25.003 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updating instance_info_cache with network_info: [{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:35:25 compute-1 nova_compute[187078]: 2025-11-24 13:35:25.022 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:35:25 compute-1 nova_compute[187078]: 2025-11-24 13:35:25.022 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:35:25 compute-1 nova_compute[187078]: 2025-11-24 13:35:25.022 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:25 compute-1 nova_compute[187078]: 2025-11-24 13:35:25.023 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:35:25 compute-1 nova_compute[187078]: 2025-11-24 13:35:25.907 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:28 compute-1 nova_compute[187078]: 2025-11-24 13:35:28.538 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:29 compute-1 nova_compute[187078]: 2025-11-24 13:35:29.016 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:30 compute-1 nova_compute[187078]: 2025-11-24 13:35:30.909 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:33 compute-1 nova_compute[187078]: 2025-11-24 13:35:33.542 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:34 compute-1 podman[215673]: 2025-11-24 13:35:34.556904273 +0000 UTC m=+0.089033109 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public)
Nov 24 13:35:35 compute-1 podman[197429]: time="2025-11-24T13:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:35:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:35:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Nov 24 13:35:35 compute-1 nova_compute[187078]: 2025-11-24 13:35:35.912 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:36 compute-1 sshd-session[215694]: Received disconnect from 5.198.176.28 port 45082:11: Bye Bye [preauth]
Nov 24 13:35:36 compute-1 sshd-session[215694]: Disconnected from authenticating user root 5.198.176.28 port 45082 [preauth]
Nov 24 13:35:36 compute-1 ovn_controller[95368]: 2025-11-24T13:35:36Z|00177|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Nov 24 13:35:38 compute-1 nova_compute[187078]: 2025-11-24 13:35:38.543 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:38 compute-1 nova_compute[187078]: 2025-11-24 13:35:38.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:38 compute-1 nova_compute[187078]: 2025-11-24 13:35:38.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:35:39 compute-1 sshd-session[215696]: Received disconnect from 68.183.82.237 port 58766:11: Bye Bye [preauth]
Nov 24 13:35:39 compute-1 sshd-session[215696]: Disconnected from authenticating user root 68.183.82.237 port 58766 [preauth]
Nov 24 13:35:40 compute-1 nova_compute[187078]: 2025-11-24 13:35:40.914 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:41 compute-1 nova_compute[187078]: 2025-11-24 13:35:41.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:35:41 compute-1 nova_compute[187078]: 2025-11-24 13:35:41.678 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:35:41 compute-1 nova_compute[187078]: 2025-11-24 13:35:41.700 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:35:43 compute-1 nova_compute[187078]: 2025-11-24 13:35:43.545 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:45 compute-1 nova_compute[187078]: 2025-11-24 13:35:45.917 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:48 compute-1 nova_compute[187078]: 2025-11-24 13:35:48.546 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: ERROR   13:35:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: ERROR   13:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: ERROR   13:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: ERROR   13:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: ERROR   13:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:35:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:35:49 compute-1 podman[215698]: 2025-11-24 13:35:49.500636569 +0000 UTC m=+0.051412149 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:35:49 compute-1 podman[215699]: 2025-11-24 13:35:49.500614928 +0000 UTC m=+0.047871202 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 24 13:35:49 compute-1 nova_compute[187078]: 2025-11-24 13:35:49.631 187082 DEBUG nova.compute.manager [None req-dbd97596-4c68-4504-92f8-100af3bbab51 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 24 13:35:49 compute-1 nova_compute[187078]: 2025-11-24 13:35:49.676 187082 DEBUG nova.compute.provider_tree [None req-dbd97596-4c68-4504-92f8-100af3bbab51 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 34 to 36 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:35:50 compute-1 nova_compute[187078]: 2025-11-24 13:35:50.919 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:53 compute-1 nova_compute[187078]: 2025-11-24 13:35:53.612 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:53 compute-1 nova_compute[187078]: 2025-11-24 13:35:53.728 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Check if temp file /var/lib/nova/instances/tmpx9xu9adc exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:35:53 compute-1 nova_compute[187078]: 2025-11-24 13:35:53.728 187082 DEBUG nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx9xu9adc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='64ec0fe3-a86c-4fe4-acd6-a449e46530e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:35:54 compute-1 nova_compute[187078]: 2025-11-24 13:35:54.430 187082 DEBUG oslo_concurrency.processutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:54 compute-1 sshd-session[215739]: Received disconnect from 176.114.89.34 port 51192:11: Bye Bye [preauth]
Nov 24 13:35:54 compute-1 sshd-session[215739]: Disconnected from authenticating user root 176.114.89.34 port 51192 [preauth]
Nov 24 13:35:54 compute-1 nova_compute[187078]: 2025-11-24 13:35:54.488 187082 DEBUG oslo_concurrency.processutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:54 compute-1 nova_compute[187078]: 2025-11-24 13:35:54.489 187082 DEBUG oslo_concurrency.processutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:35:54 compute-1 nova_compute[187078]: 2025-11-24 13:35:54.549 187082 DEBUG oslo_concurrency.processutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:35:55 compute-1 podman[215747]: 2025-11-24 13:35:55.527936477 +0000 UTC m=+0.071030691 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 13:35:55 compute-1 podman[215748]: 2025-11-24 13:35:55.550527701 +0000 UTC m=+0.088665310 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 24 13:35:55 compute-1 nova_compute[187078]: 2025-11-24 13:35:55.952 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:57 compute-1 sshd-session[215791]: Accepted publickey for nova from 192.168.122.100 port 52168 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:35:57 compute-1 systemd-logind[815]: New session 43 of user nova.
Nov 24 13:35:57 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:35:57 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:35:57 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:35:57 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:35:57 compute-1 systemd[215795]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:35:57 compute-1 systemd[215795]: Queued start job for default target Main User Target.
Nov 24 13:35:57 compute-1 systemd[215795]: Created slice User Application Slice.
Nov 24 13:35:57 compute-1 systemd[215795]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:35:57 compute-1 systemd[215795]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:35:57 compute-1 systemd[215795]: Reached target Paths.
Nov 24 13:35:57 compute-1 systemd[215795]: Reached target Timers.
Nov 24 13:35:57 compute-1 systemd[215795]: Starting D-Bus User Message Bus Socket...
Nov 24 13:35:57 compute-1 systemd[215795]: Starting Create User's Volatile Files and Directories...
Nov 24 13:35:57 compute-1 systemd[215795]: Finished Create User's Volatile Files and Directories.
Nov 24 13:35:57 compute-1 systemd[215795]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:35:57 compute-1 systemd[215795]: Reached target Sockets.
Nov 24 13:35:57 compute-1 systemd[215795]: Reached target Basic System.
Nov 24 13:35:57 compute-1 systemd[215795]: Reached target Main User Target.
Nov 24 13:35:57 compute-1 systemd[215795]: Startup finished in 128ms.
Nov 24 13:35:57 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:35:57 compute-1 systemd[1]: Started Session 43 of User nova.
Nov 24 13:35:57 compute-1 sshd-session[215791]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:35:57 compute-1 sshd-session[215810]: Received disconnect from 192.168.122.100 port 52168:11: disconnected by user
Nov 24 13:35:57 compute-1 sshd-session[215810]: Disconnected from user nova 192.168.122.100 port 52168
Nov 24 13:35:57 compute-1 sshd-session[215791]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:35:57 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Nov 24 13:35:57 compute-1 systemd-logind[815]: Session 43 logged out. Waiting for processes to exit.
Nov 24 13:35:57 compute-1 systemd-logind[815]: Removed session 43.
Nov 24 13:35:58 compute-1 nova_compute[187078]: 2025-11-24 13:35:58.614 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:58 compute-1 sshd-session[215789]: Received disconnect from 45.78.217.131 port 56478:11: Bye Bye [preauth]
Nov 24 13:35:58 compute-1 sshd-session[215789]: Disconnected from authenticating user root 45.78.217.131 port 56478 [preauth]
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.683 187082 DEBUG nova.compute.manager [req-87f8903e-c52a-4096-8b2b-a01940ec70ee req-e48bb6a7-6467-4f85-855a-d30ae53626d6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-unplugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.683 187082 DEBUG oslo_concurrency.lockutils [req-87f8903e-c52a-4096-8b2b-a01940ec70ee req-e48bb6a7-6467-4f85-855a-d30ae53626d6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.684 187082 DEBUG oslo_concurrency.lockutils [req-87f8903e-c52a-4096-8b2b-a01940ec70ee req-e48bb6a7-6467-4f85-855a-d30ae53626d6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.684 187082 DEBUG oslo_concurrency.lockutils [req-87f8903e-c52a-4096-8b2b-a01940ec70ee req-e48bb6a7-6467-4f85-855a-d30ae53626d6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.684 187082 DEBUG nova.compute.manager [req-87f8903e-c52a-4096-8b2b-a01940ec70ee req-e48bb6a7-6467-4f85-855a-d30ae53626d6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-unplugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.685 187082 DEBUG nova.compute.manager [req-87f8903e-c52a-4096-8b2b-a01940ec70ee req-e48bb6a7-6467-4f85-855a-d30ae53626d6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-unplugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:35:59 compute-1 nova_compute[187078]: 2025-11-24 13:35:59.757 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:35:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:59.756 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:35:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:35:59.757 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.309 187082 INFO nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Took 5.76 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.310 187082 DEBUG nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.324 187082 DEBUG nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx9xu9adc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='64ec0fe3-a86c-4fe4-acd6-a449e46530e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(04ecb42a-9628-4994-a6fb-df86dc060f57),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.345 187082 DEBUG nova.objects.instance [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.346 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.348 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.348 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.359 187082 DEBUG nova.virt.libvirt.vif [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:35:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1773054992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1773054992',id=20,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:35:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-wldlv05c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:35:07Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=64ec0fe3-a86c-4fe4-acd6-a449e46530e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.360 187082 DEBUG nova.network.os_vif_util [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.361 187082 DEBUG nova.network.os_vif_util [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.361 187082 DEBUG nova.virt.libvirt.migration [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:36:00 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:59:ee:8f"/>
Nov 24 13:36:00 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:36:00 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:36:00 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:36:00 compute-1 nova_compute[187078]:   <target dev="tap49ddc2d5-3f"/>
Nov 24 13:36:00 compute-1 nova_compute[187078]: </interface>
Nov 24 13:36:00 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.361 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.850 187082 DEBUG nova.virt.libvirt.migration [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.851 187082 INFO nova.virt.libvirt.migration [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.931 187082 INFO nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:36:00 compute-1 nova_compute[187078]: 2025-11-24 13:36:00.953 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.435 187082 DEBUG nova.virt.libvirt.migration [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.436 187082 DEBUG nova.virt.libvirt.migration [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.749 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991361.7491813, 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.750 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] VM Paused (Lifecycle Event)
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.755 187082 DEBUG nova.compute.manager [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.755 187082 DEBUG oslo_concurrency.lockutils [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.755 187082 DEBUG oslo_concurrency.lockutils [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.755 187082 DEBUG oslo_concurrency.lockutils [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.755 187082 DEBUG nova.compute.manager [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.756 187082 WARNING nova.compute.manager [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received unexpected event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with vm_state active and task_state migrating.
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.756 187082 DEBUG nova.compute.manager [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-changed-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.756 187082 DEBUG nova.compute.manager [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Refreshing instance network info cache due to event network-changed-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.756 187082 DEBUG oslo_concurrency.lockutils [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.756 187082 DEBUG oslo_concurrency.lockutils [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.756 187082 DEBUG nova.network.neutron [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Refreshing network info cache for port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:36:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:01.760 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.780 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.784 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.798 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:36:01 compute-1 kernel: tap49ddc2d5-3f (unregistering): left promiscuous mode
Nov 24 13:36:01 compute-1 NetworkManager[55527]: <info>  [1763991361.9067] device (tap49ddc2d5-3f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:36:01 compute-1 ovn_controller[95368]: 2025-11-24T13:36:01Z|00178|binding|INFO|Releasing lport 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 from this chassis (sb_readonly=0)
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.947 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:01 compute-1 ovn_controller[95368]: 2025-11-24T13:36:01Z|00179|binding|INFO|Setting lport 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 down in Southbound
Nov 24 13:36:01 compute-1 ovn_controller[95368]: 2025-11-24T13:36:01Z|00180|binding|INFO|Removing iface tap49ddc2d5-3f ovn-installed in OVS
Nov 24 13:36:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:01.956 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:ee:8f 10.100.0.5'], port_security=['fa:16:3e:59:ee:8f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '64ec0fe3-a86c-4fe4-acd6-a449e46530e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:36:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:01.957 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:36:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:01.959 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:36:01 compute-1 nova_compute[187078]: 2025-11-24 13:36:01.960 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:01.960 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9f08c0a2-d94f-46eb-a3c5-76443dd2021b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:01.961 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:36:02 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Deactivated successfully.
Nov 24 13:36:02 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Consumed 16.480s CPU time.
Nov 24 13:36:02 compute-1 systemd-machined[153355]: Machine qemu-14-instance-00000014 terminated.
Nov 24 13:36:02 compute-1 NetworkManager[55527]: <info>  [1763991362.1020] manager: (tap49ddc2d5-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Nov 24 13:36:02 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [NOTICE]   (215548) : haproxy version is 2.8.14-c23fe91
Nov 24 13:36:02 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [NOTICE]   (215548) : path to executable is /usr/sbin/haproxy
Nov 24 13:36:02 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [WARNING]  (215548) : Exiting Master process...
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.103 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:02 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [ALERT]    (215548) : Current worker (215550) exited with code 143 (Terminated)
Nov 24 13:36:02 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[215544]: [WARNING]  (215548) : All workers exited. Exiting... (0)
Nov 24 13:36:02 compute-1 systemd[1]: libpod-d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046.scope: Deactivated successfully.
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.109 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:02 compute-1 podman[215846]: 2025-11-24 13:36:02.11391579 +0000 UTC m=+0.051527621 container died d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 13:36:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046-userdata-shm.mount: Deactivated successfully.
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.147 187082 DEBUG nova.virt.libvirt.guest [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.148 187082 INFO nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migration operation has completed
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.148 187082 INFO nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] _post_live_migration() is started..
Nov 24 13:36:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-8eb380a3b14c6c50c772294f31d37729c71531f27cd82beb6070ffe5ca45f573-merged.mount: Deactivated successfully.
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.154 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.154 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.155 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:36:02 compute-1 podman[215846]: 2025-11-24 13:36:02.161220935 +0000 UTC m=+0.098832756 container cleanup d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:36:02 compute-1 systemd[1]: libpod-conmon-d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046.scope: Deactivated successfully.
Nov 24 13:36:02 compute-1 podman[215888]: 2025-11-24 13:36:02.222941593 +0000 UTC m=+0.040048389 container remove d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.227 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c20c0b35-3100-4f75-a070-e90db8191bbb]: (4, ('Mon Nov 24 01:36:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046)\nd2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046\nMon Nov 24 01:36:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (d2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046)\nd2b11bf43efe6184c26ede7f5ede5f07bde028d82b8b825e84f4d1cc0fe12046\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.230 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a939d0-9d6f-4d2d-b642-a0ae8a820bf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.232 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.233 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:02 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.250 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.253 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[22a9e175-88bd-4a38-afe6-103a4acca1cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.268 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[66814664-f2c7-456e-ac7d-3db8dc373cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.269 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[80e42935-e4ae-4cf9-8e4d-c72368630109]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.284 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0e838458-f052-45a4-8cae-048a905e6157]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432552, 'reachable_time': 18431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215908, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.288 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:36:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:02.288 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[c40bd841-7de8-44a8-ac54-d74e78e9fd2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.735 187082 DEBUG nova.compute.manager [req-357b0135-da39-46c3-a498-f98b7e177674 req-b2cc772b-e93c-441e-a82e-f232f9825e3e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-unplugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.736 187082 DEBUG oslo_concurrency.lockutils [req-357b0135-da39-46c3-a498-f98b7e177674 req-b2cc772b-e93c-441e-a82e-f232f9825e3e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.736 187082 DEBUG oslo_concurrency.lockutils [req-357b0135-da39-46c3-a498-f98b7e177674 req-b2cc772b-e93c-441e-a82e-f232f9825e3e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.737 187082 DEBUG oslo_concurrency.lockutils [req-357b0135-da39-46c3-a498-f98b7e177674 req-b2cc772b-e93c-441e-a82e-f232f9825e3e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.737 187082 DEBUG nova.compute.manager [req-357b0135-da39-46c3-a498-f98b7e177674 req-b2cc772b-e93c-441e-a82e-f232f9825e3e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-unplugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:36:02 compute-1 nova_compute[187078]: 2025-11-24 13:36:02.737 187082 DEBUG nova.compute.manager [req-357b0135-da39-46c3-a498-f98b7e177674 req-b2cc772b-e93c-441e-a82e-f232f9825e3e 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-unplugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.380 187082 DEBUG nova.network.neutron [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.381 187082 DEBUG nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.382 187082 DEBUG nova.virt.libvirt.vif [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:35:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1773054992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1773054992',id=20,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:35:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-wldlv05c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:35:51Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=64ec0fe3-a86c-4fe4-acd6-a449e46530e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.382 187082 DEBUG nova.network.os_vif_util [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.383 187082 DEBUG nova.network.os_vif_util [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.383 187082 DEBUG os_vif [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.386 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.386 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49ddc2d5-3f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.388 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.391 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.395 187082 INFO os_vif [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:ee:8f,bridge_name='br-int',has_traffic_filtering=True,id=49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ddc2d5-3f')
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.396 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.397 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.397 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.397 187082 DEBUG nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.398 187082 INFO nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Deleting instance files /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0_del
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.399 187082 INFO nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Deletion of /var/lib/nova/instances/64ec0fe3-a86c-4fe4-acd6-a449e46530e0_del complete
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.617 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.746 187082 DEBUG nova.network.neutron [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updated VIF entry in instance network info cache for port 49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.747 187082 DEBUG nova.network.neutron [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Updating instance_info_cache with network_info: [{"id": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "address": "fa:16:3e:59:ee:8f", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ddc2d5-3f", "ovs_interfaceid": "49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:36:03 compute-1 nova_compute[187078]: 2025-11-24 13:36:03.765 187082 DEBUG oslo_concurrency.lockutils [req-f8cea291-75bb-4b8b-a98e-133470c6fe3a req-e5b40a23-9529-434c-8c9c-2d5adda8dbdd 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-64ec0fe3-a86c-4fe4-acd6-a449e46530e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:36:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:04.165 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:04.166 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:04.166 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.810 187082 DEBUG nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.810 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.811 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.811 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.812 187082 DEBUG nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.812 187082 WARNING nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received unexpected event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with vm_state active and task_state migrating.
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.812 187082 DEBUG nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.812 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.812 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.813 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.813 187082 DEBUG nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.813 187082 WARNING nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received unexpected event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with vm_state active and task_state migrating.
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.813 187082 DEBUG nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.813 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.814 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.814 187082 DEBUG oslo_concurrency.lockutils [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.814 187082 DEBUG nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] No waiting events found dispatching network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:36:04 compute-1 nova_compute[187078]: 2025-11-24 13:36:04.814 187082 WARNING nova.compute.manager [req-53d0acdc-7af0-4b2e-b4ab-d0f7c8c045bb req-7306abe9-9081-4509-9379-ff3d6b54a1a8 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Received unexpected event network-vif-plugged-49ddc2d5-3f48-4e2f-a6b2-3a8a7d75fb02 for instance with vm_state active and task_state migrating.
Nov 24 13:36:05 compute-1 podman[215909]: 2025-11-24 13:36:05.540729049 +0000 UTC m=+0.075586166 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:36:05 compute-1 podman[197429]: time="2025-11-24T13:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:36:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:36:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:36:07 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:36:07 compute-1 systemd[215795]: Activating special unit Exit the Session...
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped target Main User Target.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped target Basic System.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped target Paths.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped target Sockets.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped target Timers.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:36:07 compute-1 systemd[215795]: Closed D-Bus User Message Bus Socket.
Nov 24 13:36:07 compute-1 systemd[215795]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:36:07 compute-1 systemd[215795]: Removed slice User Application Slice.
Nov 24 13:36:07 compute-1 systemd[215795]: Reached target Shutdown.
Nov 24 13:36:07 compute-1 systemd[215795]: Finished Exit the Session.
Nov 24 13:36:07 compute-1 systemd[215795]: Reached target Exit the Session.
Nov 24 13:36:07 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:36:07 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:36:07 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:36:07 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:36:07 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:36:07 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:36:07 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.585 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.586 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.586 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "64ec0fe3-a86c-4fe4-acd6-a449e46530e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.606 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.607 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.608 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.608 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.764 187082 WARNING nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.765 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5879MB free_disk=73.46001434326172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.765 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.765 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.800 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.826 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.869 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration 04ecb42a-9628-4994-a6fb-df86dc060f57 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.870 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.870 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.918 187082 DEBUG nova.compute.provider_tree [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.931 187082 DEBUG nova.scheduler.client.report [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.959 187082 DEBUG nova.compute.resource_tracker [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.960 187082 DEBUG oslo_concurrency.lockutils [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:07 compute-1 nova_compute[187078]: 2025-11-24 13:36:07.964 187082 INFO nova.compute.manager [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:36:08 compute-1 nova_compute[187078]: 2025-11-24 13:36:08.031 187082 INFO nova.scheduler.client.report [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration 04ecb42a-9628-4994-a6fb-df86dc060f57
Nov 24 13:36:08 compute-1 nova_compute[187078]: 2025-11-24 13:36:08.032 187082 DEBUG nova.virt.libvirt.driver [None req-bd2355c4-d32e-462d-86f1-03aa1c272fd3 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:36:08 compute-1 nova_compute[187078]: 2025-11-24 13:36:08.388 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:08 compute-1 nova_compute[187078]: 2025-11-24 13:36:08.619 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:11 compute-1 sshd-session[215932]: Invalid user ftpuser from 175.100.24.139 port 41626
Nov 24 13:36:11 compute-1 sshd-session[215932]: Received disconnect from 175.100.24.139 port 41626:11: Bye Bye [preauth]
Nov 24 13:36:11 compute-1 sshd-session[215932]: Disconnected from invalid user ftpuser 175.100.24.139 port 41626 [preauth]
Nov 24 13:36:13 compute-1 nova_compute[187078]: 2025-11-24 13:36:13.449 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:13 compute-1 nova_compute[187078]: 2025-11-24 13:36:13.621 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:14 compute-1 nova_compute[187078]: 2025-11-24 13:36:14.689 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:16 compute-1 nova_compute[187078]: 2025-11-24 13:36:16.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:17 compute-1 nova_compute[187078]: 2025-11-24 13:36:17.148 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991362.1471474, 64ec0fe3-a86c-4fe4-acd6-a449e46530e0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:36:17 compute-1 nova_compute[187078]: 2025-11-24 13:36:17.148 187082 INFO nova.compute.manager [-] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] VM Stopped (Lifecycle Event)
Nov 24 13:36:17 compute-1 nova_compute[187078]: 2025-11-24 13:36:17.168 187082 DEBUG nova.compute.manager [None req-b44b3882-7581-4be3-9aa2-de17c64b8531 - - - - - -] [instance: 64ec0fe3-a86c-4fe4-acd6-a449e46530e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:36:18 compute-1 nova_compute[187078]: 2025-11-24 13:36:18.452 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:18 compute-1 nova_compute[187078]: 2025-11-24 13:36:18.623 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:18 compute-1 nova_compute[187078]: 2025-11-24 13:36:18.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: ERROR   13:36:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: ERROR   13:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: ERROR   13:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: ERROR   13:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: ERROR   13:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:36:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:36:20 compute-1 podman[215935]: 2025-11-24 13:36:20.550023445 +0000 UTC m=+0.080966171 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:36:20 compute-1 podman[215934]: 2025-11-24 13:36:20.568431326 +0000 UTC m=+0.097210094 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.703 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.704 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.704 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.705 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.947 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.950 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5888MB free_disk=73.46050643920898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.951 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:20 compute-1 nova_compute[187078]: 2025-11-24 13:36:20.952 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:21 compute-1 nova_compute[187078]: 2025-11-24 13:36:21.006 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:36:21 compute-1 nova_compute[187078]: 2025-11-24 13:36:21.007 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:36:21 compute-1 nova_compute[187078]: 2025-11-24 13:36:21.027 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:36:21 compute-1 nova_compute[187078]: 2025-11-24 13:36:21.037 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:36:21 compute-1 nova_compute[187078]: 2025-11-24 13:36:21.038 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:36:21 compute-1 nova_compute[187078]: 2025-11-24 13:36:21.038 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:22 compute-1 nova_compute[187078]: 2025-11-24 13:36:22.039 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:23 compute-1 nova_compute[187078]: 2025-11-24 13:36:23.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:23 compute-1 nova_compute[187078]: 2025-11-24 13:36:23.624 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:23 compute-1 nova_compute[187078]: 2025-11-24 13:36:23.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:23 compute-1 nova_compute[187078]: 2025-11-24 13:36:23.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:36:23 compute-1 nova_compute[187078]: 2025-11-24 13:36:23.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:36:23 compute-1 nova_compute[187078]: 2025-11-24 13:36:23.678 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:36:24 compute-1 nova_compute[187078]: 2025-11-24 13:36:24.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:24 compute-1 nova_compute[187078]: 2025-11-24 13:36:24.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:36:26 compute-1 podman[215977]: 2025-11-24 13:36:26.573060767 +0000 UTC m=+0.104192932 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 13:36:26 compute-1 podman[215978]: 2025-11-24 13:36:26.627309522 +0000 UTC m=+0.153293297 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Nov 24 13:36:26 compute-1 nova_compute[187078]: 2025-11-24 13:36:26.661 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:27 compute-1 nova_compute[187078]: 2025-11-24 13:36:27.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:36:28 compute-1 nova_compute[187078]: 2025-11-24 13:36:28.490 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:28 compute-1 nova_compute[187078]: 2025-11-24 13:36:28.627 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:33 compute-1 nova_compute[187078]: 2025-11-24 13:36:33.493 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:33 compute-1 nova_compute[187078]: 2025-11-24 13:36:33.629 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:35 compute-1 podman[197429]: time="2025-11-24T13:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:36:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:36:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Nov 24 13:36:36 compute-1 podman[216023]: 2025-11-24 13:36:36.559477301 +0000 UTC m=+0.087238022 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, 
vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:36:38 compute-1 nova_compute[187078]: 2025-11-24 13:36:38.540 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:38 compute-1 nova_compute[187078]: 2025-11-24 13:36:38.632 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:42 compute-1 sshd-session[216045]: Invalid user user from 5.198.176.28 port 45184
Nov 24 13:36:43 compute-1 sshd-session[216045]: Received disconnect from 5.198.176.28 port 45184:11: Bye Bye [preauth]
Nov 24 13:36:43 compute-1 sshd-session[216045]: Disconnected from invalid user user 5.198.176.28 port 45184 [preauth]
Nov 24 13:36:43 compute-1 nova_compute[187078]: 2025-11-24 13:36:43.545 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:43 compute-1 nova_compute[187078]: 2025-11-24 13:36:43.635 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:43 compute-1 ovn_controller[95368]: 2025-11-24T13:36:43Z|00181|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 24 13:36:46 compute-1 nova_compute[187078]: 2025-11-24 13:36:46.946 187082 DEBUG nova.compute.manager [None req-a24b3d08-bd6d-4a5b-8301-c33231de7a08 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 24 13:36:46 compute-1 nova_compute[187078]: 2025-11-24 13:36:46.997 187082 DEBUG nova.compute.provider_tree [None req-a24b3d08-bd6d-4a5b-8301-c33231de7a08 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 36 to 39 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:36:48 compute-1 nova_compute[187078]: 2025-11-24 13:36:48.548 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:48 compute-1 nova_compute[187078]: 2025-11-24 13:36:48.637 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: ERROR   13:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: ERROR   13:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: ERROR   13:36:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: ERROR   13:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: ERROR   13:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:36:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.633 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.634 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.655 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.745 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.746 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.756 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.757 187082 INFO nova.compute.claims [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.905 187082 DEBUG nova.compute.provider_tree [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.924 187082 DEBUG nova.scheduler.client.report [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.951 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.952 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.997 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:36:50 compute-1 nova_compute[187078]: 2025-11-24 13:36:50.997 187082 DEBUG nova.network.neutron [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.053 187082 INFO nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.072 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.195 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.197 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.197 187082 INFO nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Creating image(s)
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.198 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.198 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.199 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.217 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.301 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.304 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.305 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.332 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.411 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.414 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.456 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.458 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.459 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.496 187082 DEBUG nova.policy [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:36:51 compute-1 podman[216057]: 2025-11-24 13:36:51.518512984 +0000 UTC m=+0.057299548 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:36:51 compute-1 podman[216054]: 2025-11-24 13:36:51.526867131 +0000 UTC m=+0.065854500 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.533 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.534 187082 DEBUG nova.virt.disk.api [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.534 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.595 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.597 187082 DEBUG nova.virt.disk.api [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.597 187082 DEBUG nova.objects.instance [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid 189bc088-6dca-48df-9fc1-eefae5706eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.628 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.628 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Ensure instance console log exists: /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.629 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.629 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:51 compute-1 nova_compute[187078]: 2025-11-24 13:36:51.630 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:52 compute-1 nova_compute[187078]: 2025-11-24 13:36:52.557 187082 DEBUG nova.network.neutron [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Successfully created port: 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.593 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.638 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.695 187082 DEBUG nova.network.neutron [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Successfully updated port: 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.746 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.747 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.747 187082 DEBUG nova.network.neutron [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.776 187082 DEBUG nova.compute.manager [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received event network-changed-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.777 187082 DEBUG nova.compute.manager [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Refreshing instance network info cache due to event network-changed-4a2d9656-5d3f-497e-afbb-3c6bda5ef790. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.778 187082 DEBUG oslo_concurrency.lockutils [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:36:53 compute-1 nova_compute[187078]: 2025-11-24 13:36:53.849 187082 DEBUG nova.network.neutron [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.902 187082 DEBUG nova.network.neutron [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Updating instance_info_cache with network_info: [{"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.917 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.917 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Instance network_info: |[{"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.918 187082 DEBUG oslo_concurrency.lockutils [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.918 187082 DEBUG nova.network.neutron [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Refreshing network info cache for port 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.921 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Start _get_guest_xml network_info=[{"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.926 187082 WARNING nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.934 187082 DEBUG nova.virt.libvirt.host [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.934 187082 DEBUG nova.virt.libvirt.host [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.937 187082 DEBUG nova.virt.libvirt.host [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.938 187082 DEBUG nova.virt.libvirt.host [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.939 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.940 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.940 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.941 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.941 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.941 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.941 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.942 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.942 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.942 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.942 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.943 187082 DEBUG nova.virt.hardware [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.947 187082 DEBUG nova.virt.libvirt.vif [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:36:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444628942',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444628942',id=21,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-7e7l8eny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:36:51Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=189bc088-6dca-48df-9fc1-eefae5706eac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.947 187082 DEBUG nova.network.os_vif_util [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.948 187082 DEBUG nova.network.os_vif_util [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.950 187082 DEBUG nova.objects.instance [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid 189bc088-6dca-48df-9fc1-eefae5706eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.964 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <uuid>189bc088-6dca-48df-9fc1-eefae5706eac</uuid>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <name>instance-00000015</name>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-444628942</nova:name>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:36:54</nova:creationTime>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         <nova:port uuid="4a2d9656-5d3f-497e-afbb-3c6bda5ef790">
Nov 24 13:36:54 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <system>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <entry name="serial">189bc088-6dca-48df-9fc1-eefae5706eac</entry>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <entry name="uuid">189bc088-6dca-48df-9fc1-eefae5706eac</entry>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </system>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <os>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </os>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <features>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </features>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.config"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:ae:52:f1"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <target dev="tap4a2d9656-5d"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/console.log" append="off"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <video>
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </video>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:36:54 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:36:54 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:36:54 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:36:54 compute-1 nova_compute[187078]: </domain>
Nov 24 13:36:54 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.966 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Preparing to wait for external event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.966 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.967 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.967 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.969 187082 DEBUG nova.virt.libvirt.vif [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:36:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444628942',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444628942',id=21,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-7e7l8eny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:36:51Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=189bc088-6dca-48df-9fc1-eefae5706eac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.969 187082 DEBUG nova.network.os_vif_util [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.971 187082 DEBUG nova.network.os_vif_util [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.972 187082 DEBUG os_vif [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.973 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.973 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.974 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.979 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.979 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d9656-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.980 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a2d9656-5d, col_values=(('external_ids', {'iface-id': '4a2d9656-5d3f-497e-afbb-3c6bda5ef790', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:52:f1', 'vm-uuid': '189bc088-6dca-48df-9fc1-eefae5706eac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:54 compute-1 NetworkManager[55527]: <info>  [1763991414.9903] manager: (tap4a2d9656-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.989 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.993 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:36:54 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.998 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:54.999 187082 INFO os_vif [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d')
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.053 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.054 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.054 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:ae:52:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.055 187082 INFO nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Using config drive
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.648 187082 INFO nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Creating config drive at /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.config
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.653 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6yhv4hjf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.779 187082 DEBUG oslo_concurrency.processutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6yhv4hjf" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:36:55 compute-1 kernel: tap4a2d9656-5d: entered promiscuous mode
Nov 24 13:36:55 compute-1 NetworkManager[55527]: <info>  [1763991415.8792] manager: (tap4a2d9656-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Nov 24 13:36:55 compute-1 ovn_controller[95368]: 2025-11-24T13:36:55Z|00182|binding|INFO|Claiming lport 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 for this chassis.
Nov 24 13:36:55 compute-1 ovn_controller[95368]: 2025-11-24T13:36:55Z|00183|binding|INFO|4a2d9656-5d3f-497e-afbb-3c6bda5ef790: Claiming fa:16:3e:ae:52:f1 10.100.0.10
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.883 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.903 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:52:f1 10.100.0.10'], port_security=['fa:16:3e:ae:52:f1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '189bc088-6dca-48df-9fc1-eefae5706eac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=4a2d9656-5d3f-497e-afbb-3c6bda5ef790) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.905 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.907 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:36:55 compute-1 ovn_controller[95368]: 2025-11-24T13:36:55Z|00184|binding|INFO|Setting lport 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 ovn-installed in OVS
Nov 24 13:36:55 compute-1 ovn_controller[95368]: 2025-11-24T13:36:55Z|00185|binding|INFO|Setting lport 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 up in Southbound
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.919 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:55 compute-1 nova_compute[187078]: 2025-11-24 13:36:55.923 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.923 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[574d2552-1bd7-4d4b-bd0e-193c4f0b7a51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.926 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.931 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.931 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[62d20af3-82eb-49b5-833d-71a3a06044a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.933 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d15755-3f7d-4f02-98db-1173df2a2ecf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:55 compute-1 systemd-machined[153355]: New machine qemu-15-instance-00000015.
Nov 24 13:36:55 compute-1 systemd-udevd[216126]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.945 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[d148647b-8502-42d8-af69-96934e921501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:55 compute-1 NetworkManager[55527]: <info>  [1763991415.9523] device (tap4a2d9656-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:36:55 compute-1 NetworkManager[55527]: <info>  [1763991415.9530] device (tap4a2d9656-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:36:55 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.959 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0aacc939-85d1-4a17-84e7-8f99977e8686]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:55 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:55.994 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[f78d1790-5508-446b-84d5-f8d86ee00084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 NetworkManager[55527]: <info>  [1763991416.0017] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Nov 24 13:36:56 compute-1 systemd-udevd[216130]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.000 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f18440-c35e-4d89-ab95-0b7a450d1c58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.029 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d422c7c5-e449-4c09-916c-dbf15a416bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.033 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[25229c4a-e86d-4a2a-8aac-4f578c65fbf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 NetworkManager[55527]: <info>  [1763991416.0525] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.056 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a16a60-d666-45d0-9210-01d0af5d0342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.087 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e61ac54a-825f-48f6-a0b7-32074e1c8a00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443471, 'reachable_time': 22650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216158, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.104 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[23f35c0a-225c-4df7-ae87-45e25cf5008c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443471, 'tstamp': 443471}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216159, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.127 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f85f15-7bd9-43d9-b980-2ca99f51366c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443471, 'reachable_time': 22650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216160, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.158 187082 DEBUG nova.compute.manager [req-0ddcee33-de42-4b4d-bfd9-7d30924a1e4f req-bb75fc27-ed02-4819-90a2-3553f98d1885 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.158 187082 DEBUG oslo_concurrency.lockutils [req-0ddcee33-de42-4b4d-bfd9-7d30924a1e4f req-bb75fc27-ed02-4819-90a2-3553f98d1885 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.159 187082 DEBUG oslo_concurrency.lockutils [req-0ddcee33-de42-4b4d-bfd9-7d30924a1e4f req-bb75fc27-ed02-4819-90a2-3553f98d1885 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.159 187082 DEBUG oslo_concurrency.lockutils [req-0ddcee33-de42-4b4d-bfd9-7d30924a1e4f req-bb75fc27-ed02-4819-90a2-3553f98d1885 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.159 187082 DEBUG nova.compute.manager [req-0ddcee33-de42-4b4d-bfd9-7d30924a1e4f req-bb75fc27-ed02-4819-90a2-3553f98d1885 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Processing event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.162 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e041d542-833d-427a-992b-87170a8ec318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.255 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[755605d2-2912-42f4-adce-7432340959e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.257 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.257 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.258 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:56 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:36:56 compute-1 NetworkManager[55527]: <info>  [1763991416.2630] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.264 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:36:56 compute-1 ovn_controller[95368]: 2025-11-24T13:36:56Z|00186|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.280 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.288 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.289 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.291 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.292 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[42776c81-df11-4c9d-87a5-8f6bbb1c3588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.294 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:36:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:36:56.295 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.495 187082 DEBUG nova.network.neutron [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Updated VIF entry in instance network info cache for port 4a2d9656-5d3f-497e-afbb-3c6bda5ef790. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.497 187082 DEBUG nova.network.neutron [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Updating instance_info_cache with network_info: [{"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.512 187082 DEBUG oslo_concurrency.lockutils [req-7569038f-0234-484e-98ce-8e7e0b245e32 req-abc2f28c-047b-4b60-a854-c4abf1e8c499 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.566 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.568 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991416.56804, 189bc088-6dca-48df-9fc1-eefae5706eac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.569 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] VM Started (Lifecycle Event)
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.573 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.579 187082 INFO nova.virt.libvirt.driver [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Instance spawned successfully.
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.579 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.593 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.604 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.608 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.608 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.609 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.610 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.610 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.611 187082 DEBUG nova.virt.libvirt.driver [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.640 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.640 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991416.5685296, 189bc088-6dca-48df-9fc1-eefae5706eac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.641 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] VM Paused (Lifecycle Event)
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.661 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.665 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991416.57238, 189bc088-6dca-48df-9fc1-eefae5706eac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.665 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] VM Resumed (Lifecycle Event)
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.681 187082 INFO nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Took 5.49 seconds to spawn the instance on the hypervisor.
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.682 187082 DEBUG nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.683 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.690 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:36:56 compute-1 podman[216201]: 2025-11-24 13:36:56.704013417 +0000 UTC m=+0.054667866 container create 99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.717 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.741 187082 INFO nova.compute.manager [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Took 6.01 seconds to build instance.
Nov 24 13:36:56 compute-1 systemd[1]: Started libpod-conmon-99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b.scope.
Nov 24 13:36:56 compute-1 nova_compute[187078]: 2025-11-24 13:36:56.764 187082 DEBUG oslo_concurrency.lockutils [None req-624b00b8-2158-4133-a4f7-59aea40a6149 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:56 compute-1 podman[216201]: 2025-11-24 13:36:56.67396661 +0000 UTC m=+0.024621079 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:36:56 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:36:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0ecbbac26e7487420006447ab551d06f21f397a065083f6697ab774ae02ec0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:36:56 compute-1 podman[216201]: 2025-11-24 13:36:56.811441817 +0000 UTC m=+0.162096266 container init 99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 13:36:56 compute-1 podman[216215]: 2025-11-24 13:36:56.814620664 +0000 UTC m=+0.079994546 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:36:56 compute-1 podman[216201]: 2025-11-24 13:36:56.819104225 +0000 UTC m=+0.169758694 container start 99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 13:36:56 compute-1 podman[216216]: 2025-11-24 13:36:56.84283979 +0000 UTC m=+0.105226911 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:36:56 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [NOTICE]   (216263) : New worker (216268) forked
Nov 24 13:36:56 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [NOTICE]   (216263) : Loading success.
Nov 24 13:36:57 compute-1 sshd-session[216170]: Received disconnect from 68.183.82.237 port 57968:11: Bye Bye [preauth]
Nov 24 13:36:57 compute-1 sshd-session[216170]: Disconnected from authenticating user root 68.183.82.237 port 57968 [preauth]
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.219 187082 DEBUG nova.compute.manager [req-c3013ecd-03bb-4e68-9f8c-bf7cad5116f4 req-ff065ddc-d810-444e-b684-6b0930700fbc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.219 187082 DEBUG oslo_concurrency.lockutils [req-c3013ecd-03bb-4e68-9f8c-bf7cad5116f4 req-ff065ddc-d810-444e-b684-6b0930700fbc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.219 187082 DEBUG oslo_concurrency.lockutils [req-c3013ecd-03bb-4e68-9f8c-bf7cad5116f4 req-ff065ddc-d810-444e-b684-6b0930700fbc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.219 187082 DEBUG oslo_concurrency.lockutils [req-c3013ecd-03bb-4e68-9f8c-bf7cad5116f4 req-ff065ddc-d810-444e-b684-6b0930700fbc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.219 187082 DEBUG nova.compute.manager [req-c3013ecd-03bb-4e68-9f8c-bf7cad5116f4 req-ff065ddc-d810-444e-b684-6b0930700fbc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] No waiting events found dispatching network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.220 187082 WARNING nova.compute.manager [req-c3013ecd-03bb-4e68-9f8c-bf7cad5116f4 req-ff065ddc-d810-444e-b684-6b0930700fbc 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received unexpected event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 for instance with vm_state active and task_state None.
Nov 24 13:36:58 compute-1 nova_compute[187078]: 2025-11-24 13:36:58.641 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:36:59 compute-1 nova_compute[187078]: 2025-11-24 13:36:59.991 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:01 compute-1 sshd-session[216280]: Invalid user rstudio from 176.114.89.34 port 40786
Nov 24 13:37:01 compute-1 sshd-session[216280]: Received disconnect from 176.114.89.34 port 40786:11: Bye Bye [preauth]
Nov 24 13:37:01 compute-1 sshd-session[216280]: Disconnected from invalid user rstudio 176.114.89.34 port 40786 [preauth]
Nov 24 13:37:02 compute-1 sshd-session[216278]: Invalid user radarr from 45.78.194.40 port 34544
Nov 24 13:37:03 compute-1 sshd-session[216278]: Received disconnect from 45.78.194.40 port 34544:11: Bye Bye [preauth]
Nov 24 13:37:03 compute-1 sshd-session[216278]: Disconnected from invalid user radarr 45.78.194.40 port 34544 [preauth]
Nov 24 13:37:03 compute-1 nova_compute[187078]: 2025-11-24 13:37:03.645 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:04.166 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:04.168 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:04.169 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:04 compute-1 nova_compute[187078]: 2025-11-24 13:37:04.995 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:05.244 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:37:05 compute-1 nova_compute[187078]: 2025-11-24 13:37:05.244 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:05 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:05.245 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:37:05 compute-1 podman[197429]: time="2025-11-24T13:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:37:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:37:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 24 13:37:07 compute-1 podman[216283]: 2025-11-24 13:37:07.503528438 +0000 UTC m=+0.052750985 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 13:37:08 compute-1 nova_compute[187078]: 2025-11-24 13:37:08.647 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:09 compute-1 nova_compute[187078]: 2025-11-24 13:37:09.998 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:10 compute-1 ovn_controller[95368]: 2025-11-24T13:37:10Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ae:52:f1 10.100.0.10
Nov 24 13:37:10 compute-1 ovn_controller[95368]: 2025-11-24T13:37:10Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ae:52:f1 10.100.0.10
Nov 24 13:37:13 compute-1 nova_compute[187078]: 2025-11-24 13:37:13.650 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:14 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:14.248 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:15 compute-1 nova_compute[187078]: 2025-11-24 13:37:15.001 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:15 compute-1 nova_compute[187078]: 2025-11-24 13:37:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:16 compute-1 nova_compute[187078]: 2025-11-24 13:37:16.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:17 compute-1 nova_compute[187078]: 2025-11-24 13:37:17.411 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Creating tmpfile /var/lib/nova/instances/tmpziygodsb to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 24 13:37:17 compute-1 nova_compute[187078]: 2025-11-24 13:37:17.509 187082 DEBUG nova.compute.manager [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpziygodsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 24 13:37:18 compute-1 nova_compute[187078]: 2025-11-24 13:37:18.654 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:18 compute-1 nova_compute[187078]: 2025-11-24 13:37:18.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: ERROR   13:37:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: ERROR   13:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: ERROR   13:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: ERROR   13:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: ERROR   13:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:37:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:37:19 compute-1 nova_compute[187078]: 2025-11-24 13:37:19.804 187082 DEBUG nova.compute.manager [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpziygodsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='83fd919d-e59d-4c12-aa3c-518c524f99af',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 24 13:37:19 compute-1 nova_compute[187078]: 2025-11-24 13:37:19.828 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "refresh_cache-83fd919d-e59d-4c12-aa3c-518c524f99af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:37:19 compute-1 nova_compute[187078]: 2025-11-24 13:37:19.829 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquired lock "refresh_cache-83fd919d-e59d-4c12-aa3c-518c524f99af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:37:19 compute-1 nova_compute[187078]: 2025-11-24 13:37:19.829 187082 DEBUG nova.network.neutron [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.003 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.701 187082 DEBUG nova.network.neutron [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Updating instance_info_cache with network_info: [{"id": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "address": "fa:16:3e:b6:9e:10", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc0bec7e-5b", "ovs_interfaceid": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.720 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Releasing lock "refresh_cache-83fd919d-e59d-4c12-aa3c-518c524f99af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.722 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpziygodsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='83fd919d-e59d-4c12-aa3c-518c524f99af',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.723 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Creating instance directory: /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.724 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Creating disk.info with the contents: {'/var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk': 'qcow2', '/var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.725 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.726 187082 DEBUG nova.objects.instance [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 83fd919d-e59d-4c12-aa3c-518c524f99af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.752 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.817 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.819 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.819 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.835 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.924 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.925 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.963 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.964 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:20 compute-1 nova_compute[187078]: 2025-11-24 13:37:20.965 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.035 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.036 187082 DEBUG nova.virt.disk.api [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Checking if we can resize image /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.037 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.091 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.092 187082 DEBUG nova.virt.disk.api [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Cannot resize image /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.093 187082 DEBUG nova.objects.instance [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lazy-loading 'migration_context' on Instance uuid 83fd919d-e59d-4c12-aa3c-518c524f99af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.106 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.131 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.133 187082 DEBUG nova.virt.libvirt.volume.remotefs [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk.config to /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.133 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk.config /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.617 187082 DEBUG oslo_concurrency.processutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af/disk.config /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.618 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.619 187082 DEBUG nova.virt.libvirt.vif [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:37:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-171950899',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-171950899',id=22,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:37:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-vegp8msu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:37:09Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=83fd919d-e59d-4c12-aa3c-518c524f99af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "address": "fa:16:3e:b6:9e:10", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc0bec7e-5b", "ovs_interfaceid": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.620 187082 DEBUG nova.network.os_vif_util [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Converting VIF {"id": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "address": "fa:16:3e:b6:9e:10", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc0bec7e-5b", "ovs_interfaceid": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.621 187082 DEBUG nova.network.os_vif_util [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9e:10,bridge_name='br-int',has_traffic_filtering=True,id=cc0bec7e-5b9e-4a2c-b10b-0335bad942af,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc0bec7e-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.621 187082 DEBUG os_vif [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9e:10,bridge_name='br-int',has_traffic_filtering=True,id=cc0bec7e-5b9e-4a2c-b10b-0335bad942af,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc0bec7e-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.622 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.622 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.623 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.625 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.625 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc0bec7e-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.626 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc0bec7e-5b, col_values=(('external_ids', {'iface-id': 'cc0bec7e-5b9e-4a2c-b10b-0335bad942af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:9e:10', 'vm-uuid': '83fd919d-e59d-4c12-aa3c-518c524f99af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.675 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.675 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:21 compute-1 NetworkManager[55527]: <info>  [1763991441.6757] manager: (tapcc0bec7e-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.683 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.683 187082 INFO os_vif [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9e:10,bridge_name='br-int',has_traffic_filtering=True,id=cc0bec7e-5b9e-4a2c-b10b-0335bad942af,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc0bec7e-5b')
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.684 187082 DEBUG nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 24 13:37:21 compute-1 nova_compute[187078]: 2025-11-24 13:37:21.684 187082 DEBUG nova.compute.manager [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpziygodsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='83fd919d-e59d-4c12-aa3c-518c524f99af',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 24 13:37:22 compute-1 podman[216344]: 2025-11-24 13:37:22.52956348 +0000 UTC m=+0.069873450 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:37:22 compute-1 podman[216343]: 2025-11-24 13:37:22.548610528 +0000 UTC m=+0.083220763 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.693 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.767 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.835 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.836 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:37:22 compute-1 nova_compute[187078]: 2025-11-24 13:37:22.923 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.068 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.069 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5673MB free_disk=73.43000793457031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.069 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.070 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.113 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Migration for instance 83fd919d-e59d-4c12-aa3c-518c524f99af refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.128 187082 INFO nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Updating resource usage from migration 00533e90-210d-4e67-9d05-226d2ecdc960
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.128 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Starting to track incoming migration 00533e90-210d-4e67-9d05-226d2ecdc960 with flavor 9fb1ccae-4ba6-4040-a754-0b156b72dc25 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.169 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 189bc088-6dca-48df-9fc1-eefae5706eac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.185 187082 WARNING nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 83fd919d-e59d-4c12-aa3c-518c524f99af has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.186 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.186 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.203 187082 DEBUG nova.network.neutron [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Port cc0bec7e-5b9e-4a2c-b10b-0335bad942af updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.205 187082 DEBUG nova.compute.manager [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpziygodsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='83fd919d-e59d-4c12-aa3c-518c524f99af',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.251 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.265 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.287 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.287 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:23 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 24 13:37:23 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 24 13:37:23 compute-1 NetworkManager[55527]: <info>  [1763991443.5245] manager: (tapcc0bec7e-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 24 13:37:23 compute-1 kernel: tapcc0bec7e-5b: entered promiscuous mode
Nov 24 13:37:23 compute-1 systemd-udevd[216420]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:37:23 compute-1 ovn_controller[95368]: 2025-11-24T13:37:23Z|00187|binding|INFO|Claiming lport cc0bec7e-5b9e-4a2c-b10b-0335bad942af for this additional chassis.
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.573 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:23 compute-1 ovn_controller[95368]: 2025-11-24T13:37:23Z|00188|binding|INFO|cc0bec7e-5b9e-4a2c-b10b-0335bad942af: Claiming fa:16:3e:b6:9e:10 10.100.0.5
Nov 24 13:37:23 compute-1 ovn_controller[95368]: 2025-11-24T13:37:23Z|00189|binding|INFO|Setting lport cc0bec7e-5b9e-4a2c-b10b-0335bad942af ovn-installed in OVS
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.588 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:23 compute-1 NetworkManager[55527]: <info>  [1763991443.5934] device (tapcc0bec7e-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:37:23 compute-1 NetworkManager[55527]: <info>  [1763991443.5945] device (tapcc0bec7e-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:37:23 compute-1 systemd-machined[153355]: New machine qemu-16-instance-00000016.
Nov 24 13:37:23 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-00000016.
Nov 24 13:37:23 compute-1 nova_compute[187078]: 2025-11-24 13:37:23.654 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:24 compute-1 nova_compute[187078]: 2025-11-24 13:37:24.493 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991444.4935386, 83fd919d-e59d-4c12-aa3c-518c524f99af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:37:24 compute-1 nova_compute[187078]: 2025-11-24 13:37:24.494 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] VM Started (Lifecycle Event)
Nov 24 13:37:24 compute-1 nova_compute[187078]: 2025-11-24 13:37:24.512 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.280 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991445.2799366, 83fd919d-e59d-4c12-aa3c-518c524f99af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.280 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] VM Resumed (Lifecycle Event)
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.286 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.286 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.298 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.302 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.329 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.669 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.670 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.911 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.911 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.911 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:37:25 compute-1 nova_compute[187078]: 2025-11-24 13:37:25.912 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 189bc088-6dca-48df-9fc1-eefae5706eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:37:26 compute-1 ovn_controller[95368]: 2025-11-24T13:37:26Z|00190|binding|INFO|Claiming lport cc0bec7e-5b9e-4a2c-b10b-0335bad942af for this chassis.
Nov 24 13:37:26 compute-1 ovn_controller[95368]: 2025-11-24T13:37:26Z|00191|binding|INFO|cc0bec7e-5b9e-4a2c-b10b-0335bad942af: Claiming fa:16:3e:b6:9e:10 10.100.0.5
Nov 24 13:37:26 compute-1 ovn_controller[95368]: 2025-11-24T13:37:26Z|00192|binding|INFO|Setting lport cc0bec7e-5b9e-4a2c-b10b-0335bad942af up in Southbound
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.308 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:9e:10 10.100.0.5'], port_security=['fa:16:3e:b6:9e:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '83fd919d-e59d-4c12-aa3c-518c524f99af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '11', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=cc0bec7e-5b9e-4a2c-b10b-0335bad942af) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.310 104225 INFO neutron.agent.ovn.metadata.agent [-] Port cc0bec7e-5b9e-4a2c-b10b-0335bad942af in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.313 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.339 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ec884d09-2a3e-4c4b-b860-dab59820afb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.381 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ac52c8-f066-4e4e-a4fb-a0b71ac442bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.385 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[3865f2fb-a7a9-4e0f-be9b-d642a3bf8820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.436 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c99027c5-2095-42ce-a2b5-35691ad0f6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.450 187082 INFO nova.compute.manager [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Post operation of migration started
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.464 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed56bbd-b3b8-4193-a615-e59e0f909d93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443471, 'reachable_time': 22650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216457, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.493 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9633c8b4-8648-4230-a0f1-68c5bfc53cbb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443487, 'tstamp': 443487}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216458, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443491, 'tstamp': 443491}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216458, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.495 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.498 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.499 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.500 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.501 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.502 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:26 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:26.502 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.675 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.705 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "refresh_cache-83fd919d-e59d-4c12-aa3c-518c524f99af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.706 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquired lock "refresh_cache-83fd919d-e59d-4c12-aa3c-518c524f99af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:37:26 compute-1 nova_compute[187078]: 2025-11-24 13:37:26.706 187082 DEBUG nova.network.neutron [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.024 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Updating instance_info_cache with network_info: [{"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.039 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-189bc088-6dca-48df-9fc1-eefae5706eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.039 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:37:27 compute-1 podman[216459]: 2025-11-24 13:37:27.556666978 +0000 UTC m=+0.089945926 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 13:37:27 compute-1 podman[216460]: 2025-11-24 13:37:27.625679543 +0000 UTC m=+0.157081590 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.736 187082 DEBUG nova.network.neutron [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Updating instance_info_cache with network_info: [{"id": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "address": "fa:16:3e:b6:9e:10", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc0bec7e-5b", "ovs_interfaceid": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.755 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Releasing lock "refresh_cache-83fd919d-e59d-4c12-aa3c-518c524f99af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.769 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.770 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.770 187082 DEBUG oslo_concurrency.lockutils [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:27 compute-1 nova_compute[187078]: 2025-11-24 13:37:27.775 187082 INFO nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 24 13:37:27 compute-1 virtqemud[186628]: Domain id=16 name='instance-00000016' uuid=83fd919d-e59d-4c12-aa3c-518c524f99af is tainted: custom-monitor
Nov 24 13:37:28 compute-1 nova_compute[187078]: 2025-11-24 13:37:28.708 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:28 compute-1 nova_compute[187078]: 2025-11-24 13:37:28.784 187082 INFO nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 24 13:37:29 compute-1 nova_compute[187078]: 2025-11-24 13:37:29.030 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:37:29 compute-1 nova_compute[187078]: 2025-11-24 13:37:29.791 187082 INFO nova.virt.libvirt.driver [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 24 13:37:29 compute-1 nova_compute[187078]: 2025-11-24 13:37:29.797 187082 DEBUG nova.compute.manager [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:37:29 compute-1 nova_compute[187078]: 2025-11-24 13:37:29.816 187082 DEBUG nova.objects.instance [None req-12037514-8113-4f78-b9e0-668c01e927f9 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 24 13:37:31 compute-1 sshd-session[216505]: Invalid user sol from 45.148.10.240 port 39972
Nov 24 13:37:31 compute-1 sshd-session[216505]: Connection closed by invalid user sol 45.148.10.240 port 39972 [preauth]
Nov 24 13:37:31 compute-1 nova_compute[187078]: 2025-11-24 13:37:31.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:33 compute-1 nova_compute[187078]: 2025-11-24 13:37:33.710 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 podman[197429]: time="2025-11-24T13:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:37:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:37:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.678 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "83fd919d-e59d-4c12-aa3c-518c524f99af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.679 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.679 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.680 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.680 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.681 187082 INFO nova.compute.manager [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Terminating instance
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.682 187082 DEBUG nova.compute.manager [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:37:35 compute-1 kernel: tapcc0bec7e-5b (unregistering): left promiscuous mode
Nov 24 13:37:35 compute-1 NetworkManager[55527]: <info>  [1763991455.7041] device (tapcc0bec7e-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:37:35 compute-1 ovn_controller[95368]: 2025-11-24T13:37:35Z|00193|binding|INFO|Releasing lport cc0bec7e-5b9e-4a2c-b10b-0335bad942af from this chassis (sb_readonly=0)
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.711 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 ovn_controller[95368]: 2025-11-24T13:37:35Z|00194|binding|INFO|Setting lport cc0bec7e-5b9e-4a2c-b10b-0335bad942af down in Southbound
Nov 24 13:37:35 compute-1 ovn_controller[95368]: 2025-11-24T13:37:35Z|00195|binding|INFO|Removing iface tapcc0bec7e-5b ovn-installed in OVS
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.713 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.717 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:9e:10 10.100.0.5'], port_security=['fa:16:3e:b6:9e:10 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '83fd919d-e59d-4c12-aa3c-518c524f99af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '13', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=cc0bec7e-5b9e-4a2c-b10b-0335bad942af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.718 104225 INFO neutron.agent.ovn.metadata.agent [-] Port cc0bec7e-5b9e-4a2c-b10b-0335bad942af in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.720 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.725 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.742 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e0ec8b89-3ada-4f5a-956e-8bdc5d5a5498]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:35 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Deactivated successfully.
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.769 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[24ff6bf5-b37e-4dcc-878b-22d35eae2816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:35 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Consumed 1.912s CPU time.
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.772 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c4a1a2-0701-4d7d-b853-ab44ba108bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:35 compute-1 systemd-machined[153355]: Machine qemu-16-instance-00000016 terminated.
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.799 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[68390ffe-7c93-4de8-9ced-e49552a1a8ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.814 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8778ad91-3aec-41b6-8c24-d758840a116d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 868, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 868, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443471, 'reachable_time': 22650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216521, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.829 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a373d009-ad41-40b4-9126-8f452e65146f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443487, 'tstamp': 443487}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216522, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee6bf4e1-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443491, 'tstamp': 443491}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216522, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.831 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.832 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.837 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.837 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.838 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.838 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:35 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:35.839 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.908 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.915 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.966 187082 INFO nova.virt.libvirt.driver [-] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Instance destroyed successfully.
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.967 187082 DEBUG nova.objects.instance [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'resources' on Instance uuid 83fd919d-e59d-4c12-aa3c-518c524f99af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.981 187082 DEBUG nova.virt.libvirt.vif [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-24T13:37:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-171950899',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-171950899',id=22,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:37:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-vegp8msu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:37:29Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=83fd919d-e59d-4c12-aa3c-518c524f99af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "address": "fa:16:3e:b6:9e:10", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc0bec7e-5b", "ovs_interfaceid": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.981 187082 DEBUG nova.network.os_vif_util [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "address": "fa:16:3e:b6:9e:10", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc0bec7e-5b", "ovs_interfaceid": "cc0bec7e-5b9e-4a2c-b10b-0335bad942af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.982 187082 DEBUG nova.network.os_vif_util [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:9e:10,bridge_name='br-int',has_traffic_filtering=True,id=cc0bec7e-5b9e-4a2c-b10b-0335bad942af,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc0bec7e-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.983 187082 DEBUG os_vif [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:9e:10,bridge_name='br-int',has_traffic_filtering=True,id=cc0bec7e-5b9e-4a2c-b10b-0335bad942af,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc0bec7e-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.984 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.985 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc0bec7e-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.986 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.990 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.992 187082 INFO os_vif [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:9e:10,bridge_name='br-int',has_traffic_filtering=True,id=cc0bec7e-5b9e-4a2c-b10b-0335bad942af,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc0bec7e-5b')
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.993 187082 INFO nova.virt.libvirt.driver [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Deleting instance files /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af_del
Nov 24 13:37:35 compute-1 nova_compute[187078]: 2025-11-24 13:37:35.994 187082 INFO nova.virt.libvirt.driver [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Deletion of /var/lib/nova/instances/83fd919d-e59d-4c12-aa3c-518c524f99af_del complete
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.038 187082 INFO nova.compute.manager [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.039 187082 DEBUG oslo.service.loopingcall [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.040 187082 DEBUG nova.compute.manager [-] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.040 187082 DEBUG nova.network.neutron [-] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.582 187082 DEBUG nova.compute.manager [req-68bbe753-f316-43fd-a680-9f9047f49c44 req-db3b4e10-b1ee-4705-b6c1-d4709f2df0f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Received event network-vif-unplugged-cc0bec7e-5b9e-4a2c-b10b-0335bad942af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.583 187082 DEBUG oslo_concurrency.lockutils [req-68bbe753-f316-43fd-a680-9f9047f49c44 req-db3b4e10-b1ee-4705-b6c1-d4709f2df0f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.584 187082 DEBUG oslo_concurrency.lockutils [req-68bbe753-f316-43fd-a680-9f9047f49c44 req-db3b4e10-b1ee-4705-b6c1-d4709f2df0f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.584 187082 DEBUG oslo_concurrency.lockutils [req-68bbe753-f316-43fd-a680-9f9047f49c44 req-db3b4e10-b1ee-4705-b6c1-d4709f2df0f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.585 187082 DEBUG nova.compute.manager [req-68bbe753-f316-43fd-a680-9f9047f49c44 req-db3b4e10-b1ee-4705-b6c1-d4709f2df0f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] No waiting events found dispatching network-vif-unplugged-cc0bec7e-5b9e-4a2c-b10b-0335bad942af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:37:36 compute-1 nova_compute[187078]: 2025-11-24 13:37:36.585 187082 DEBUG nova.compute.manager [req-68bbe753-f316-43fd-a680-9f9047f49c44 req-db3b4e10-b1ee-4705-b6c1-d4709f2df0f6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Received event network-vif-unplugged-cc0bec7e-5b9e-4a2c-b10b-0335bad942af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:37:37 compute-1 sshd-session[216508]: Connection closed by authenticating user root 80.94.95.115 port 24480 [preauth]
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.648 187082 DEBUG nova.network.neutron [-] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.672 187082 INFO nova.compute.manager [-] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Took 1.63 seconds to deallocate network for instance.
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.729 187082 DEBUG nova.compute.manager [req-dbae816e-92d4-484b-a181-83c1c9b4dc9f req-0e137da1-2082-4bb2-8e3c-d83be75cac8d 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Received event network-vif-deleted-cc0bec7e-5b9e-4a2c-b10b-0335bad942af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.739 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.739 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.744 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.780 187082 INFO nova.scheduler.client.report [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Deleted allocations for instance 83fd919d-e59d-4c12-aa3c-518c524f99af
Nov 24 13:37:37 compute-1 nova_compute[187078]: 2025-11-24 13:37:37.844 187082 DEBUG oslo_concurrency.lockutils [None req-2b524543-c777-4cb2-bc27-3acf075d246f 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.032 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.032 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.033 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.033 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.034 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.036 187082 INFO nova.compute.manager [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Terminating instance
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.037 187082 DEBUG nova.compute.manager [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:37:38 compute-1 kernel: tap4a2d9656-5d (unregistering): left promiscuous mode
Nov 24 13:37:38 compute-1 NetworkManager[55527]: <info>  [1763991458.0710] device (tap4a2d9656-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:37:38 compute-1 ovn_controller[95368]: 2025-11-24T13:37:38Z|00196|binding|INFO|Releasing lport 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 from this chassis (sb_readonly=0)
Nov 24 13:37:38 compute-1 ovn_controller[95368]: 2025-11-24T13:37:38Z|00197|binding|INFO|Setting lport 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 down in Southbound
Nov 24 13:37:38 compute-1 ovn_controller[95368]: 2025-11-24T13:37:38Z|00198|binding|INFO|Removing iface tap4a2d9656-5d ovn-installed in OVS
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.130 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.138 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:52:f1 10.100.0.10'], port_security=['fa:16:3e:ae:52:f1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '189bc088-6dca-48df-9fc1-eefae5706eac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=4a2d9656-5d3f-497e-afbb-3c6bda5ef790) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.139 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 4a2d9656-5d3f-497e-afbb-3c6bda5ef790 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.141 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.142 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[227b94e6-4ce5-46f0-8094-b1555e7c9b8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.142 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.143 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:37:38 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Nov 24 13:37:38 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 15.114s CPU time.
Nov 24 13:37:38 compute-1 systemd-machined[153355]: Machine qemu-15-instance-00000015 terminated.
Nov 24 13:37:38 compute-1 podman[216544]: 2025-11-24 13:37:38.240106144 +0000 UTC m=+0.071671248 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:37:38 compute-1 NetworkManager[55527]: <info>  [1763991458.2538] manager: (tap4a2d9656-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.256 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.258 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [NOTICE]   (216263) : haproxy version is 2.8.14-c23fe91
Nov 24 13:37:38 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [NOTICE]   (216263) : path to executable is /usr/sbin/haproxy
Nov 24 13:37:38 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [WARNING]  (216263) : Exiting Master process...
Nov 24 13:37:38 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [ALERT]    (216263) : Current worker (216268) exited with code 143 (Terminated)
Nov 24 13:37:38 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[216242]: [WARNING]  (216263) : All workers exited. Exiting... (0)
Nov 24 13:37:38 compute-1 systemd[1]: libpod-99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b.scope: Deactivated successfully.
Nov 24 13:37:38 compute-1 podman[216579]: 2025-11-24 13:37:38.273622415 +0000 UTC m=+0.044890571 container died 99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:37:38 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b-userdata-shm.mount: Deactivated successfully.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.302 187082 INFO nova.virt.libvirt.driver [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Instance destroyed successfully.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.303 187082 DEBUG nova.objects.instance [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'resources' on Instance uuid 189bc088-6dca-48df-9fc1-eefae5706eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:37:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-a0ecbbac26e7487420006447ab551d06f21f397a065083f6697ab774ae02ec0b-merged.mount: Deactivated successfully.
Nov 24 13:37:38 compute-1 podman[216579]: 2025-11-24 13:37:38.317011465 +0000 UTC m=+0.088279631 container cleanup 99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.320 187082 DEBUG nova.virt.libvirt.vif [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:36:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444628942',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444628942',id=21,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:36:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-7e7l8eny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:36:56Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=189bc088-6dca-48df-9fc1-eefae5706eac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.320 187082 DEBUG nova.network.os_vif_util [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "address": "fa:16:3e:ae:52:f1", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a2d9656-5d", "ovs_interfaceid": "4a2d9656-5d3f-497e-afbb-3c6bda5ef790", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.321 187082 DEBUG nova.network.os_vif_util [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.322 187082 DEBUG os_vif [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.324 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.324 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d9656-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:38 compute-1 systemd[1]: libpod-conmon-99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b.scope: Deactivated successfully.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.329 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.331 187082 INFO os_vif [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:52:f1,bridge_name='br-int',has_traffic_filtering=True,id=4a2d9656-5d3f-497e-afbb-3c6bda5ef790,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a2d9656-5d')
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.332 187082 INFO nova.virt.libvirt.driver [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Deleting instance files /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac_del
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.333 187082 INFO nova.virt.libvirt.driver [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Deletion of /var/lib/nova/instances/189bc088-6dca-48df-9fc1-eefae5706eac_del complete
Nov 24 13:37:38 compute-1 podman[216628]: 2025-11-24 13:37:38.382208676 +0000 UTC m=+0.042097485 container remove 99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.388 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3be93e36-df2b-47f1-b6bb-aa1fe1be39d9]: (4, ('Mon Nov 24 01:37:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b)\n99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b\nMon Nov 24 01:37:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b)\n99945cf3e8c2abf41d2d241374be1f3027f865bdf3b3f7addf29f422693eb75b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.390 187082 INFO nova.compute.manager [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.390 187082 DEBUG oslo.service.loopingcall [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.390 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d72e47fd-2d88-4f03-9b0f-3dd928a575f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.391 187082 DEBUG nova.compute.manager [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.391 187082 DEBUG nova.network.neutron [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.391 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.393 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.404 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.408 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[82185d55-c328-43c9-9e1c-da3bda7c2535]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.422 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a12513be-12ba-4497-a72b-9d4ff287e9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.424 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[24ae3868-327b-4751-a807-14765e792980]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.437 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[33831845-4a97-4660-974e-718d54fbe800]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443464, 'reachable_time': 38588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216644, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.439 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:37:38 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:37:38.440 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[68ff504c-2c1f-4b32-9b5a-07d195707a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:37:38 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.684 187082 DEBUG nova.compute.manager [req-a926f9b1-220d-418a-ada1-4ea1a6dc1de6 req-6ea61a2f-9ee8-4df7-b79a-f3710d1c3ea9 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Received event network-vif-plugged-cc0bec7e-5b9e-4a2c-b10b-0335bad942af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.684 187082 DEBUG oslo_concurrency.lockutils [req-a926f9b1-220d-418a-ada1-4ea1a6dc1de6 req-6ea61a2f-9ee8-4df7-b79a-f3710d1c3ea9 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.685 187082 DEBUG oslo_concurrency.lockutils [req-a926f9b1-220d-418a-ada1-4ea1a6dc1de6 req-6ea61a2f-9ee8-4df7-b79a-f3710d1c3ea9 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.685 187082 DEBUG oslo_concurrency.lockutils [req-a926f9b1-220d-418a-ada1-4ea1a6dc1de6 req-6ea61a2f-9ee8-4df7-b79a-f3710d1c3ea9 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "83fd919d-e59d-4c12-aa3c-518c524f99af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.686 187082 DEBUG nova.compute.manager [req-a926f9b1-220d-418a-ada1-4ea1a6dc1de6 req-6ea61a2f-9ee8-4df7-b79a-f3710d1c3ea9 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] No waiting events found dispatching network-vif-plugged-cc0bec7e-5b9e-4a2c-b10b-0335bad942af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.686 187082 WARNING nova.compute.manager [req-a926f9b1-220d-418a-ada1-4ea1a6dc1de6 req-6ea61a2f-9ee8-4df7-b79a-f3710d1c3ea9 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Received unexpected event network-vif-plugged-cc0bec7e-5b9e-4a2c-b10b-0335bad942af for instance with vm_state deleted and task_state None.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.712 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.896 187082 DEBUG nova.network.neutron [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.911 187082 INFO nova.compute.manager [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Took 0.52 seconds to deallocate network for instance.
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.944 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.945 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.984 187082 DEBUG nova.compute.provider_tree [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:37:38 compute-1 nova_compute[187078]: 2025-11-24 13:37:38.995 187082 DEBUG nova.scheduler.client.report [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.012 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.056 187082 INFO nova.scheduler.client.report [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Deleted allocations for instance 189bc088-6dca-48df-9fc1-eefae5706eac
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.121 187082 DEBUG oslo_concurrency.lockutils [None req-1baaf4ee-572f-489e-9582-ff4e4b3f2590 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.814 187082 DEBUG nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received event network-vif-unplugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.816 187082 DEBUG oslo_concurrency.lockutils [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.817 187082 DEBUG oslo_concurrency.lockutils [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.817 187082 DEBUG oslo_concurrency.lockutils [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.817 187082 DEBUG nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] No waiting events found dispatching network-vif-unplugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.818 187082 WARNING nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received unexpected event network-vif-unplugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 for instance with vm_state deleted and task_state None.
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.818 187082 DEBUG nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.818 187082 DEBUG oslo_concurrency.lockutils [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.819 187082 DEBUG oslo_concurrency.lockutils [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.819 187082 DEBUG oslo_concurrency.lockutils [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "189bc088-6dca-48df-9fc1-eefae5706eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.819 187082 DEBUG nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] No waiting events found dispatching network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.820 187082 WARNING nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received unexpected event network-vif-plugged-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 for instance with vm_state deleted and task_state None.
Nov 24 13:37:39 compute-1 nova_compute[187078]: 2025-11-24 13:37:39.820 187082 DEBUG nova.compute.manager [req-7a579eb4-9571-4c5a-8245-387abfec5f49 req-456d0fe2-1662-4042-94a3-85b898b9ba12 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Received event network-vif-deleted-4a2d9656-5d3f-497e-afbb-3c6bda5ef790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:37:43 compute-1 nova_compute[187078]: 2025-11-24 13:37:43.327 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:43 compute-1 nova_compute[187078]: 2025-11-24 13:37:43.758 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:48 compute-1 nova_compute[187078]: 2025-11-24 13:37:48.330 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:48 compute-1 nova_compute[187078]: 2025-11-24 13:37:48.761 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:49 compute-1 openstack_network_exporter[199599]: ERROR   13:37:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:37:49 compute-1 openstack_network_exporter[199599]: ERROR   13:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:37:49 compute-1 openstack_network_exporter[199599]: ERROR   13:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:37:49 compute-1 openstack_network_exporter[199599]: ERROR   13:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:37:49 compute-1 openstack_network_exporter[199599]: ERROR   13:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:37:50 compute-1 nova_compute[187078]: 2025-11-24 13:37:50.965 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991455.9643898, 83fd919d-e59d-4c12-aa3c-518c524f99af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:37:50 compute-1 nova_compute[187078]: 2025-11-24 13:37:50.966 187082 INFO nova.compute.manager [-] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] VM Stopped (Lifecycle Event)
Nov 24 13:37:50 compute-1 nova_compute[187078]: 2025-11-24 13:37:50.989 187082 DEBUG nova.compute.manager [None req-87939952-5e7c-4f78-b0d4-c46fabf05e44 - - - - - -] [instance: 83fd919d-e59d-4c12-aa3c-518c524f99af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:37:51 compute-1 sshd-session[216645]: Invalid user sens from 175.100.24.139 port 43834
Nov 24 13:37:51 compute-1 sshd-session[216645]: Received disconnect from 175.100.24.139 port 43834:11: Bye Bye [preauth]
Nov 24 13:37:51 compute-1 sshd-session[216645]: Disconnected from invalid user sens 175.100.24.139 port 43834 [preauth]
Nov 24 13:37:53 compute-1 nova_compute[187078]: 2025-11-24 13:37:53.296 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991458.2947457, 189bc088-6dca-48df-9fc1-eefae5706eac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:37:53 compute-1 nova_compute[187078]: 2025-11-24 13:37:53.296 187082 INFO nova.compute.manager [-] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] VM Stopped (Lifecycle Event)
Nov 24 13:37:53 compute-1 nova_compute[187078]: 2025-11-24 13:37:53.311 187082 DEBUG nova.compute.manager [None req-55cfd685-e31c-4baa-a294-73b4143c14f2 - - - - - -] [instance: 189bc088-6dca-48df-9fc1-eefae5706eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:37:53 compute-1 nova_compute[187078]: 2025-11-24 13:37:53.333 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:53 compute-1 podman[216647]: 2025-11-24 13:37:53.526016248 +0000 UTC m=+0.068730529 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:37:53 compute-1 podman[216648]: 2025-11-24 13:37:53.569229432 +0000 UTC m=+0.105526459 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 24 13:37:53 compute-1 nova_compute[187078]: 2025-11-24 13:37:53.765 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:58 compute-1 nova_compute[187078]: 2025-11-24 13:37:58.335 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:37:58 compute-1 podman[216692]: 2025-11-24 13:37:58.560485147 +0000 UTC m=+0.094956872 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:37:58 compute-1 podman[216693]: 2025-11-24 13:37:58.591952222 +0000 UTC m=+0.123165388 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:37:58 compute-1 nova_compute[187078]: 2025-11-24 13:37:58.768 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:01 compute-1 sshd-session[216737]: Invalid user node from 193.32.162.146 port 50972
Nov 24 13:38:01 compute-1 sshd-session[216737]: Connection closed by invalid user node 193.32.162.146 port 50972 [preauth]
Nov 24 13:38:03 compute-1 nova_compute[187078]: 2025-11-24 13:38:03.342 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:03 compute-1 nova_compute[187078]: 2025-11-24 13:38:03.771 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:04.167 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:04.168 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:04.168 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:05 compute-1 podman[197429]: time="2025-11-24T13:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:38:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:38:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 24 13:38:08 compute-1 nova_compute[187078]: 2025-11-24 13:38:08.347 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:08 compute-1 podman[216739]: 2025-11-24 13:38:08.479915129 +0000 UTC m=+0.088107066 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Nov 24 13:38:08 compute-1 nova_compute[187078]: 2025-11-24 13:38:08.816 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:08 compute-1 ovn_controller[95368]: 2025-11-24T13:38:08Z|00199|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 24 13:38:11 compute-1 sshd-session[216760]: Invalid user copia from 176.114.89.34 port 58416
Nov 24 13:38:11 compute-1 sshd-session[216760]: Received disconnect from 176.114.89.34 port 58416:11: Bye Bye [preauth]
Nov 24 13:38:11 compute-1 sshd-session[216760]: Disconnected from invalid user copia 176.114.89.34 port 58416 [preauth]
Nov 24 13:38:13 compute-1 nova_compute[187078]: 2025-11-24 13:38:13.350 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:13 compute-1 nova_compute[187078]: 2025-11-24 13:38:13.866 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:15 compute-1 nova_compute[187078]: 2025-11-24 13:38:15.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:17 compute-1 nova_compute[187078]: 2025-11-24 13:38:17.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:18 compute-1 nova_compute[187078]: 2025-11-24 13:38:18.352 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:18 compute-1 sshd-session[216762]: Invalid user leo from 68.183.82.237 port 57770
Nov 24 13:38:18 compute-1 sshd-session[216762]: Received disconnect from 68.183.82.237 port 57770:11: Bye Bye [preauth]
Nov 24 13:38:18 compute-1 sshd-session[216762]: Disconnected from invalid user leo 68.183.82.237 port 57770 [preauth]
Nov 24 13:38:18 compute-1 nova_compute[187078]: 2025-11-24 13:38:18.868 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: ERROR   13:38:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: ERROR   13:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: ERROR   13:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: ERROR   13:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: ERROR   13:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:38:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:38:19 compute-1 nova_compute[187078]: 2025-11-24 13:38:19.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:21 compute-1 nova_compute[187078]: 2025-11-24 13:38:21.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:22 compute-1 nova_compute[187078]: 2025-11-24 13:38:22.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:23 compute-1 nova_compute[187078]: 2025-11-24 13:38:23.383 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:23 compute-1 sshd-session[216764]: Received disconnect from 5.198.176.28 port 45292:11: Bye Bye [preauth]
Nov 24 13:38:23 compute-1 sshd-session[216764]: Disconnected from authenticating user root 5.198.176.28 port 45292 [preauth]
Nov 24 13:38:23 compute-1 nova_compute[187078]: 2025-11-24 13:38:23.870 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:24 compute-1 podman[216766]: 2025-11-24 13:38:24.520657064 +0000 UTC m=+0.065287917 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:38:24 compute-1 podman[216767]: 2025-11-24 13:38:24.546754446 +0000 UTC m=+0.076819427 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.691 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.898 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.899 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5873MB free_disk=73.45932388305664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.900 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.900 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.964 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.964 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:38:24 compute-1 nova_compute[187078]: 2025-11-24 13:38:24.992 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:38:25 compute-1 nova_compute[187078]: 2025-11-24 13:38:25.004 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:38:25 compute-1 nova_compute[187078]: 2025-11-24 13:38:25.025 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:38:25 compute-1 nova_compute[187078]: 2025-11-24 13:38:25.025 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:28 compute-1 nova_compute[187078]: 2025-11-24 13:38:28.028 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:28 compute-1 nova_compute[187078]: 2025-11-24 13:38:28.029 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:38:28 compute-1 nova_compute[187078]: 2025-11-24 13:38:28.029 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:38:28 compute-1 nova_compute[187078]: 2025-11-24 13:38:28.043 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:38:28 compute-1 nova_compute[187078]: 2025-11-24 13:38:28.386 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:28 compute-1 nova_compute[187078]: 2025-11-24 13:38:28.872 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:29 compute-1 podman[216812]: 2025-11-24 13:38:29.529892299 +0000 UTC m=+0.066115200 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:38:29 compute-1 podman[216813]: 2025-11-24 13:38:29.62028772 +0000 UTC m=+0.151680940 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 13:38:29 compute-1 nova_compute[187078]: 2025-11-24 13:38:29.674 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:30 compute-1 nova_compute[187078]: 2025-11-24 13:38:30.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:38:33 compute-1 nova_compute[187078]: 2025-11-24 13:38:33.401 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:33 compute-1 nova_compute[187078]: 2025-11-24 13:38:33.874 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:35 compute-1 podman[197429]: time="2025-11-24T13:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:38:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:38:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 24 13:38:38 compute-1 nova_compute[187078]: 2025-11-24 13:38:38.403 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:38 compute-1 nova_compute[187078]: 2025-11-24 13:38:38.920 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:39 compute-1 podman[216859]: 2025-11-24 13:38:39.538250295 +0000 UTC m=+0.075551823 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 24 13:38:43 compute-1 nova_compute[187078]: 2025-11-24 13:38:43.406 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:43 compute-1 nova_compute[187078]: 2025-11-24 13:38:43.962 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.745 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.745 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.757 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.818 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.818 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.824 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.825 187082 INFO nova.compute.claims [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.921 187082 DEBUG nova.compute.provider_tree [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.931 187082 DEBUG nova.scheduler.client.report [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.947 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.948 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.983 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.983 187082 DEBUG nova.network.neutron [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:38:45 compute-1 nova_compute[187078]: 2025-11-24 13:38:45.998 187082 INFO nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.015 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.090 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.092 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.093 187082 INFO nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Creating image(s)
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.094 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "/var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.094 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.095 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "/var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.115 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.144 187082 DEBUG nova.policy [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '44609a4d2fa941a4b26d6b27a5d4a6d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a66bcdc071b741ef8709a4608acd6051', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.209 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.211 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.212 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.230 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.320 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.321 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.355 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.356 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.357 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.409 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.409 187082 DEBUG nova.virt.disk.api [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Checking if we can resize image /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.410 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.466 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.467 187082 DEBUG nova.virt.disk.api [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Cannot resize image /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.467 187082 DEBUG nova.objects.instance [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'migration_context' on Instance uuid 32131561-653e-44d4-9108-c4a2b0328dbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.482 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.483 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Ensure instance console log exists: /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.483 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.483 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.483 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:46 compute-1 nova_compute[187078]: 2025-11-24 13:38:46.829 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:46.830 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:38:46 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:46.832 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.057 187082 DEBUG nova.network.neutron [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Successfully created port: 6784f44d-c942-41a3-89a4-d1e14ae39574 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.691 187082 DEBUG nova.network.neutron [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Successfully updated port: 6784f44d-c942-41a3-89a4-d1e14ae39574 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.705 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.706 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquired lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.706 187082 DEBUG nova.network.neutron [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.761 187082 DEBUG nova.compute.manager [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-changed-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.762 187082 DEBUG nova.compute.manager [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Refreshing instance network info cache due to event network-changed-6784f44d-c942-41a3-89a4-d1e14ae39574. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.762 187082 DEBUG oslo_concurrency.lockutils [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:38:47 compute-1 nova_compute[187078]: 2025-11-24 13:38:47.831 187082 DEBUG nova.network.neutron [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.410 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.481 187082 DEBUG nova.network.neutron [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updating instance_info_cache with network_info: [{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.505 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Releasing lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.506 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Instance network_info: |[{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.507 187082 DEBUG oslo_concurrency.lockutils [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.507 187082 DEBUG nova.network.neutron [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Refreshing network info cache for port 6784f44d-c942-41a3-89a4-d1e14ae39574 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.513 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Start _get_guest_xml network_info=[{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.519 187082 WARNING nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.529 187082 DEBUG nova.virt.libvirt.host [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.530 187082 DEBUG nova.virt.libvirt.host [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.534 187082 DEBUG nova.virt.libvirt.host [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.535 187082 DEBUG nova.virt.libvirt.host [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.537 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.538 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.539 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.539 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.540 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.540 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.541 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.541 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.542 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.542 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.543 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.543 187082 DEBUG nova.virt.hardware [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.549 187082 DEBUG nova.virt.libvirt.vif [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:38:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-25191881',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-25191881',id=23,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-9g80qya3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:38:46Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=32131561-653e-44d4-9108-c4a2b0328dbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.550 187082 DEBUG nova.network.os_vif_util [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.551 187082 DEBUG nova.network.os_vif_util [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.553 187082 DEBUG nova.objects.instance [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lazy-loading 'pci_devices' on Instance uuid 32131561-653e-44d4-9108-c4a2b0328dbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.565 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <uuid>32131561-653e-44d4-9108-c4a2b0328dbd</uuid>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <name>instance-00000017</name>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteStrategies-server-25191881</nova:name>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:38:48</nova:creationTime>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:user uuid="44609a4d2fa941a4b26d6b27a5d4a6d2">tempest-TestExecuteStrategies-392394962-project-member</nova:user>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:project uuid="a66bcdc071b741ef8709a4608acd6051">tempest-TestExecuteStrategies-392394962</nova:project>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         <nova:port uuid="6784f44d-c942-41a3-89a4-d1e14ae39574">
Nov 24 13:38:48 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <system>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <entry name="serial">32131561-653e-44d4-9108-c4a2b0328dbd</entry>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <entry name="uuid">32131561-653e-44d4-9108-c4a2b0328dbd</entry>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </system>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <os>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </os>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <features>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </features>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.config"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:64:0c:06"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <target dev="tap6784f44d-c9"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/console.log" append="off"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <video>
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </video>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:38:48 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:38:48 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:38:48 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:38:48 compute-1 nova_compute[187078]: </domain>
Nov 24 13:38:48 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.567 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Preparing to wait for external event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.568 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.568 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.569 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.570 187082 DEBUG nova.virt.libvirt.vif [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:38:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-25191881',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-25191881',id=23,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-9g80qya3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:38:46Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=32131561-653e-44d4-9108-c4a2b0328dbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.570 187082 DEBUG nova.network.os_vif_util [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converting VIF {"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.572 187082 DEBUG nova.network.os_vif_util [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.572 187082 DEBUG os_vif [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.573 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.574 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.574 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.578 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.579 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6784f44d-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.580 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6784f44d-c9, col_values=(('external_ids', {'iface-id': '6784f44d-c942-41a3-89a4-d1e14ae39574', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:0c:06', 'vm-uuid': '32131561-653e-44d4-9108-c4a2b0328dbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.582 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:48 compute-1 NetworkManager[55527]: <info>  [1763991528.5842] manager: (tap6784f44d-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.585 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.593 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.594 187082 INFO os_vif [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9')
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.642 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.642 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.643 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] No VIF found with MAC fa:16:3e:64:0c:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.643 187082 INFO nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Using config drive
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.903 187082 INFO nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Creating config drive at /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.config
Nov 24 13:38:48 compute-1 nova_compute[187078]: 2025-11-24 13:38:48.912 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmperppvdu6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.004 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.045 187082 DEBUG oslo_concurrency.processutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmperppvdu6" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:38:49 compute-1 kernel: tap6784f44d-c9: entered promiscuous mode
Nov 24 13:38:49 compute-1 NetworkManager[55527]: <info>  [1763991529.1337] manager: (tap6784f44d-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 24 13:38:49 compute-1 ovn_controller[95368]: 2025-11-24T13:38:49Z|00200|binding|INFO|Claiming lport 6784f44d-c942-41a3-89a4-d1e14ae39574 for this chassis.
Nov 24 13:38:49 compute-1 ovn_controller[95368]: 2025-11-24T13:38:49Z|00201|binding|INFO|6784f44d-c942-41a3-89a4-d1e14ae39574: Claiming fa:16:3e:64:0c:06 10.100.0.10
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.133 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.159 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:0c:06 10.100.0.10'], port_security=['fa:16:3e:64:0c:06 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '32131561-653e-44d4-9108-c4a2b0328dbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=6784f44d-c942-41a3-89a4-d1e14ae39574) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.160 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 6784f44d-c942-41a3-89a4-d1e14ae39574 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 bound to our chassis
Nov 24 13:38:49 compute-1 ovn_controller[95368]: 2025-11-24T13:38:49Z|00202|binding|INFO|Setting lport 6784f44d-c942-41a3-89a4-d1e14ae39574 ovn-installed in OVS
Nov 24 13:38:49 compute-1 ovn_controller[95368]: 2025-11-24T13:38:49Z|00203|binding|INFO|Setting lport 6784f44d-c942-41a3-89a4-d1e14ae39574 up in Southbound
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.162 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.163 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.164 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.180 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[61d3f577-70bf-45d3-b19e-69329f9a217e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.181 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee6bf4e1-a1 in ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.184 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee6bf4e1-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.184 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8bd89c-90ab-456e-98a8-2c0bafa246d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.185 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[be982f8e-f770-40d7-b9cc-d073865ae1b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 systemd-udevd[216915]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:38:49 compute-1 systemd-machined[153355]: New machine qemu-17-instance-00000017.
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.201 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[8177202e-60de-4e93-a2d6-256c6da7006b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Nov 24 13:38:49 compute-1 NetworkManager[55527]: <info>  [1763991529.2137] device (tap6784f44d-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:38:49 compute-1 NetworkManager[55527]: <info>  [1763991529.2159] device (tap6784f44d-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.231 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[85993099-4015-4bbb-b8b7-0575af610fbc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.276 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[15d1793c-2c83-42cd-9da7-25dae16bbcff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 systemd-udevd[216919]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:38:49 compute-1 NetworkManager[55527]: <info>  [1763991529.2887] manager: (tapee6bf4e1-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.289 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9f85a15a-6cfb-4e3a-a9ff-ea7345ff345a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.333 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[7853db91-ee5b-42e2-9fc5-e2469e6906d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.337 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[894f12f3-ae7c-4fb4-a419-2c536f01952d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 NetworkManager[55527]: <info>  [1763991529.3677] device (tapee6bf4e1-a0): carrier: link connected
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.374 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[143d91a5-6edb-49d0-90f3-38444f6aa158]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.392 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3a37a23f-d04d-455d-a4c9-94d8b9dc85b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454802, 'reachable_time': 27368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216947, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.408 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[42f25838-1466-4d35-919c-b3693a3b95d5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:5bc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454802, 'tstamp': 454802}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216948, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: ERROR   13:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: ERROR   13:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: ERROR   13:38:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: ERROR   13:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: ERROR   13:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:38:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.432 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[59c92644-0dfb-4aa3-bd6c-b0d15c86ce99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee6bf4e1-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:5b:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454802, 'reachable_time': 27368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216950, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.474 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[106a42bb-6843-4ca0-9a9a-5b9d2d3e7750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.524 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991529.5237353, 32131561-653e-44d4-9108-c4a2b0328dbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.524 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] VM Started (Lifecycle Event)
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.552 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.555 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[79fee3a6-88fa-4ef3-9ec2-36fbd226fefd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.557 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991529.5247636, 32131561-653e-44d4-9108-c4a2b0328dbd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.557 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] VM Paused (Lifecycle Event)
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.558 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.558 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.559 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee6bf4e1-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:49 compute-1 NetworkManager[55527]: <info>  [1763991529.5625] manager: (tapee6bf4e1-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 24 13:38:49 compute-1 kernel: tapee6bf4e1-a0: entered promiscuous mode
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.564 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.566 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee6bf4e1-a0, col_values=(('external_ids', {'iface-id': '3f7bb31c-e9f4-4c4a-ad4a-8451f233926d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.568 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 ovn_controller[95368]: 2025-11-24T13:38:49Z|00204|binding|INFO|Releasing lport 3f7bb31c-e9f4-4c4a-ad4a-8451f233926d from this chassis (sb_readonly=0)
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.568 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.569 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.569 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[34cf797b-39fd-4b5b-8deb-de2aa0b89e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.570 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.pid.haproxy
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:38:49 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:49.571 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'env', 'PROCESS_TAG=haproxy-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.579 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.580 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.583 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.599 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.919 187082 DEBUG nova.network.neutron [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updated VIF entry in instance network info cache for port 6784f44d-c942-41a3-89a4-d1e14ae39574. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.920 187082 DEBUG nova.network.neutron [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updating instance_info_cache with network_info: [{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:38:49 compute-1 nova_compute[187078]: 2025-11-24 13:38:49.934 187082 DEBUG oslo_concurrency.lockutils [req-ee52d816-1c7b-40c3-ba9c-f537f01cae83 req-6920438a-fad3-4a7d-ad7a-df38122d4790 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:38:49 compute-1 podman[216988]: 2025-11-24 13:38:49.944064151 +0000 UTC m=+0.052425532 container create 5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:38:49 compute-1 systemd[1]: Started libpod-conmon-5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6.scope.
Nov 24 13:38:50 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:38:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f3447e99e661791f07702b2df16b317ed2d9eb2019a6abfa0a1dfd53ad27ec6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:38:50 compute-1 podman[216988]: 2025-11-24 13:38:49.916726155 +0000 UTC m=+0.025087586 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:38:50 compute-1 podman[216988]: 2025-11-24 13:38:50.015999596 +0000 UTC m=+0.124361007 container init 5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:38:50 compute-1 podman[216988]: 2025-11-24 13:38:50.020683521 +0000 UTC m=+0.129044902 container start 5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 24 13:38:50 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [NOTICE]   (217007) : New worker (217009) forked
Nov 24 13:38:50 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [NOTICE]   (217007) : Loading success.
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.064 187082 DEBUG nova.compute.manager [req-ebb1ff38-f43b-4716-8730-37afae2c7c45 req-a89e2ed6-2391-4db9-a438-c23eded665c6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.064 187082 DEBUG oslo_concurrency.lockutils [req-ebb1ff38-f43b-4716-8730-37afae2c7c45 req-a89e2ed6-2391-4db9-a438-c23eded665c6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.065 187082 DEBUG oslo_concurrency.lockutils [req-ebb1ff38-f43b-4716-8730-37afae2c7c45 req-a89e2ed6-2391-4db9-a438-c23eded665c6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.065 187082 DEBUG oslo_concurrency.lockutils [req-ebb1ff38-f43b-4716-8730-37afae2c7c45 req-a89e2ed6-2391-4db9-a438-c23eded665c6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.066 187082 DEBUG nova.compute.manager [req-ebb1ff38-f43b-4716-8730-37afae2c7c45 req-a89e2ed6-2391-4db9-a438-c23eded665c6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Processing event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.067 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.072 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991530.0726864, 32131561-653e-44d4-9108-c4a2b0328dbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.073 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] VM Resumed (Lifecycle Event)
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.075 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.079 187082 INFO nova.virt.libvirt.driver [-] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Instance spawned successfully.
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.079 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.104 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.108 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.111 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.112 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.112 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.112 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.113 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.113 187082 DEBUG nova.virt.libvirt.driver [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.141 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.180 187082 INFO nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Took 4.09 seconds to spawn the instance on the hypervisor.
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.181 187082 DEBUG nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.242 187082 INFO nova.compute.manager [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Took 4.45 seconds to build instance.
Nov 24 13:38:50 compute-1 nova_compute[187078]: 2025-11-24 13:38:50.257 187082 DEBUG oslo_concurrency.lockutils [None req-a0c4040f-b9c2-47eb-ac62-0b17b5dd13ba 44609a4d2fa941a4b26d6b27a5d4a6d2 a66bcdc071b741ef8709a4608acd6051 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:52 compute-1 nova_compute[187078]: 2025-11-24 13:38:52.329 187082 DEBUG nova.compute.manager [req-fadbac27-a0f6-4fbb-b4d4-89d57fa52fc9 req-d998168b-adfa-4249-a2b3-9b5137de3ec2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:38:52 compute-1 nova_compute[187078]: 2025-11-24 13:38:52.330 187082 DEBUG oslo_concurrency.lockutils [req-fadbac27-a0f6-4fbb-b4d4-89d57fa52fc9 req-d998168b-adfa-4249-a2b3-9b5137de3ec2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:38:52 compute-1 nova_compute[187078]: 2025-11-24 13:38:52.331 187082 DEBUG oslo_concurrency.lockutils [req-fadbac27-a0f6-4fbb-b4d4-89d57fa52fc9 req-d998168b-adfa-4249-a2b3-9b5137de3ec2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:38:52 compute-1 nova_compute[187078]: 2025-11-24 13:38:52.331 187082 DEBUG oslo_concurrency.lockutils [req-fadbac27-a0f6-4fbb-b4d4-89d57fa52fc9 req-d998168b-adfa-4249-a2b3-9b5137de3ec2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:38:52 compute-1 nova_compute[187078]: 2025-11-24 13:38:52.332 187082 DEBUG nova.compute.manager [req-fadbac27-a0f6-4fbb-b4d4-89d57fa52fc9 req-d998168b-adfa-4249-a2b3-9b5137de3ec2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:38:52 compute-1 nova_compute[187078]: 2025-11-24 13:38:52.332 187082 WARNING nova.compute.manager [req-fadbac27-a0f6-4fbb-b4d4-89d57fa52fc9 req-d998168b-adfa-4249-a2b3-9b5137de3ec2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received unexpected event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with vm_state active and task_state None.
Nov 24 13:38:52 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:38:52.836 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:38:53 compute-1 nova_compute[187078]: 2025-11-24 13:38:53.583 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:54 compute-1 nova_compute[187078]: 2025-11-24 13:38:54.006 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:55 compute-1 podman[217018]: 2025-11-24 13:38:55.517031596 +0000 UTC m=+0.062664266 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:38:55 compute-1 podman[217019]: 2025-11-24 13:38:55.56215783 +0000 UTC m=+0.090821964 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 24 13:38:58 compute-1 nova_compute[187078]: 2025-11-24 13:38:58.586 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:38:59 compute-1 nova_compute[187078]: 2025-11-24 13:38:59.009 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:00 compute-1 podman[217060]: 2025-11-24 13:39:00.550286086 +0000 UTC m=+0.095191641 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 24 13:39:00 compute-1 podman[217061]: 2025-11-24 13:39:00.571926918 +0000 UTC m=+0.114131810 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 13:39:03 compute-1 nova_compute[187078]: 2025-11-24 13:39:03.589 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:03 compute-1 ovn_controller[95368]: 2025-11-24T13:39:03Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:64:0c:06 10.100.0.10
Nov 24 13:39:03 compute-1 ovn_controller[95368]: 2025-11-24T13:39:03Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:0c:06 10.100.0.10
Nov 24 13:39:04 compute-1 nova_compute[187078]: 2025-11-24 13:39:04.052 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:04.168 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:04.168 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:04.169 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:05 compute-1 podman[197429]: time="2025-11-24T13:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:39:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:39:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 24 13:39:08 compute-1 nova_compute[187078]: 2025-11-24 13:39:08.591 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:09 compute-1 nova_compute[187078]: 2025-11-24 13:39:09.055 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:10 compute-1 podman[217117]: 2025-11-24 13:39:10.544482814 +0000 UTC m=+0.082733526 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:39:13 compute-1 nova_compute[187078]: 2025-11-24 13:39:13.594 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:14 compute-1 nova_compute[187078]: 2025-11-24 13:39:14.059 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:14 compute-1 sshd-session[217139]: Connection closed by 87.236.176.82 port 40479
Nov 24 13:39:14 compute-1 sshd-session[217140]: Connection closed by 87.236.176.82 port 42623 [preauth]
Nov 24 13:39:17 compute-1 nova_compute[187078]: 2025-11-24 13:39:17.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:18 compute-1 nova_compute[187078]: 2025-11-24 13:39:18.648 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:19 compute-1 nova_compute[187078]: 2025-11-24 13:39:19.061 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: ERROR   13:39:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: ERROR   13:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: ERROR   13:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: ERROR   13:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: ERROR   13:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:39:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:39:19 compute-1 nova_compute[187078]: 2025-11-24 13:39:19.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:20 compute-1 nova_compute[187078]: 2025-11-24 13:39:20.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:21 compute-1 nova_compute[187078]: 2025-11-24 13:39:21.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:23 compute-1 nova_compute[187078]: 2025-11-24 13:39:23.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:23 compute-1 nova_compute[187078]: 2025-11-24 13:39:23.697 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:24 compute-1 nova_compute[187078]: 2025-11-24 13:39:24.064 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:25 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:25 compute-1 podman[217144]: 2025-11-24 13:39:25.691228949 +0000 UTC m=+0.052506744 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.692 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:39:25 compute-1 podman[217143]: 2025-11-24 13:39:25.723802344 +0000 UTC m=+0.096040053 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.758 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.822 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.825 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:39:25 compute-1 nova_compute[187078]: 2025-11-24 13:39:25.891 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.066 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.067 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5683MB free_disk=73.43038177490234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.068 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.068 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.158 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 32131561-653e-44d4-9108-c4a2b0328dbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.158 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.158 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.182 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.204 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.205 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.219 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.273 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.372 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.385 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.408 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:39:26 compute-1 nova_compute[187078]: 2025-11-24 13:39:26.409 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:27 compute-1 nova_compute[187078]: 2025-11-24 13:39:27.410 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:27 compute-1 nova_compute[187078]: 2025-11-24 13:39:27.411 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:39:27 compute-1 nova_compute[187078]: 2025-11-24 13:39:27.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:27 compute-1 nova_compute[187078]: 2025-11-24 13:39:27.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:39:27 compute-1 nova_compute[187078]: 2025-11-24 13:39:27.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:39:28 compute-1 nova_compute[187078]: 2025-11-24 13:39:28.471 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:39:28 compute-1 nova_compute[187078]: 2025-11-24 13:39:28.472 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:39:28 compute-1 nova_compute[187078]: 2025-11-24 13:39:28.472 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:39:28 compute-1 nova_compute[187078]: 2025-11-24 13:39:28.472 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 32131561-653e-44d4-9108-c4a2b0328dbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:39:28 compute-1 ovn_controller[95368]: 2025-11-24T13:39:28Z|00205|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 24 13:39:28 compute-1 nova_compute[187078]: 2025-11-24 13:39:28.703 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:29 compute-1 nova_compute[187078]: 2025-11-24 13:39:29.066 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:29 compute-1 sshd-session[217193]: Invalid user casaos from 175.100.24.139 port 46090
Nov 24 13:39:29 compute-1 sshd-session[217193]: Received disconnect from 175.100.24.139 port 46090:11: Bye Bye [preauth]
Nov 24 13:39:29 compute-1 sshd-session[217193]: Disconnected from invalid user casaos 175.100.24.139 port 46090 [preauth]
Nov 24 13:39:31 compute-1 podman[217195]: 2025-11-24 13:39:31.561655434 +0000 UTC m=+0.093849885 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:39:31 compute-1 podman[217196]: 2025-11-24 13:39:31.587113198 +0000 UTC m=+0.123151353 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:39:31 compute-1 nova_compute[187078]: 2025-11-24 13:39:31.668 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updating instance_info_cache with network_info: [{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:39:31 compute-1 nova_compute[187078]: 2025-11-24 13:39:31.690 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:39:31 compute-1 nova_compute[187078]: 2025-11-24 13:39:31.691 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:39:33 compute-1 sshd-session[217240]: Invalid user zmarin from 5.198.176.28 port 45408
Nov 24 13:39:33 compute-1 nova_compute[187078]: 2025-11-24 13:39:33.705 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:33 compute-1 sshd-session[217240]: Received disconnect from 5.198.176.28 port 45408:11: Bye Bye [preauth]
Nov 24 13:39:33 compute-1 sshd-session[217240]: Disconnected from invalid user zmarin 5.198.176.28 port 45408 [preauth]
Nov 24 13:39:34 compute-1 nova_compute[187078]: 2025-11-24 13:39:34.069 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:35 compute-1 podman[197429]: time="2025-11-24T13:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:39:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:39:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 24 13:39:35 compute-1 nova_compute[187078]: 2025-11-24 13:39:35.684 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:39:35 compute-1 sshd-session[217243]: Received disconnect from 68.183.82.237 port 50800:11: Bye Bye [preauth]
Nov 24 13:39:35 compute-1 sshd-session[217243]: Disconnected from authenticating user root 68.183.82.237 port 50800 [preauth]
Nov 24 13:39:38 compute-1 nova_compute[187078]: 2025-11-24 13:39:38.708 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:39 compute-1 nova_compute[187078]: 2025-11-24 13:39:39.122 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:41 compute-1 sshd-session[217247]: Connection closed by 165.232.48.44 port 54414
Nov 24 13:39:41 compute-1 podman[217248]: 2025-11-24 13:39:41.521702889 +0000 UTC m=+0.070533459 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 24 13:39:41 compute-1 sshd-session[217245]: Invalid user cgpexpert from 45.78.194.40 port 40652
Nov 24 13:39:43 compute-1 sshd-session[217270]: Invalid user sol from 45.148.10.240 port 60126
Nov 24 13:39:43 compute-1 sshd-session[217270]: Connection closed by invalid user sol 45.148.10.240 port 60126 [preauth]
Nov 24 13:39:43 compute-1 nova_compute[187078]: 2025-11-24 13:39:43.710 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:44 compute-1 nova_compute[187078]: 2025-11-24 13:39:44.122 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:47 compute-1 nova_compute[187078]: 2025-11-24 13:39:47.050 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Check if temp file /var/lib/nova/instances/tmp8sgdfqlf exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:39:47 compute-1 nova_compute[187078]: 2025-11-24 13:39:47.050 187082 DEBUG nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8sgdfqlf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='32131561-653e-44d4-9108-c4a2b0328dbd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:39:47 compute-1 nova_compute[187078]: 2025-11-24 13:39:47.697 187082 DEBUG oslo_concurrency.processutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:39:47 compute-1 nova_compute[187078]: 2025-11-24 13:39:47.751 187082 DEBUG oslo_concurrency.processutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:39:47 compute-1 nova_compute[187078]: 2025-11-24 13:39:47.752 187082 DEBUG oslo_concurrency.processutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:39:47 compute-1 nova_compute[187078]: 2025-11-24 13:39:47.814 187082 DEBUG oslo_concurrency.processutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:39:48 compute-1 nova_compute[187078]: 2025-11-24 13:39:48.712 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:49 compute-1 nova_compute[187078]: 2025-11-24 13:39:49.153 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:49 compute-1 sshd-session[217279]: Accepted publickey for nova from 192.168.122.100 port 41214 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: ERROR   13:39:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: ERROR   13:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: ERROR   13:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: ERROR   13:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: ERROR   13:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:39:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:39:49 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:39:49 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:39:49 compute-1 systemd-logind[815]: New session 45 of user nova.
Nov 24 13:39:49 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:39:49 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:39:49 compute-1 systemd[217283]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:39:49 compute-1 systemd[217283]: Queued start job for default target Main User Target.
Nov 24 13:39:49 compute-1 systemd[217283]: Created slice User Application Slice.
Nov 24 13:39:49 compute-1 systemd[217283]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:39:49 compute-1 systemd[217283]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:39:49 compute-1 systemd[217283]: Reached target Paths.
Nov 24 13:39:49 compute-1 systemd[217283]: Reached target Timers.
Nov 24 13:39:49 compute-1 systemd[217283]: Starting D-Bus User Message Bus Socket...
Nov 24 13:39:49 compute-1 systemd[217283]: Starting Create User's Volatile Files and Directories...
Nov 24 13:39:49 compute-1 systemd[217283]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:39:49 compute-1 systemd[217283]: Reached target Sockets.
Nov 24 13:39:49 compute-1 systemd[217283]: Finished Create User's Volatile Files and Directories.
Nov 24 13:39:49 compute-1 systemd[217283]: Reached target Basic System.
Nov 24 13:39:49 compute-1 systemd[217283]: Reached target Main User Target.
Nov 24 13:39:49 compute-1 systemd[217283]: Startup finished in 175ms.
Nov 24 13:39:49 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:39:49 compute-1 systemd[1]: Started Session 45 of User nova.
Nov 24 13:39:49 compute-1 sshd-session[217279]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:39:49 compute-1 sshd-session[217298]: Received disconnect from 192.168.122.100 port 41214:11: disconnected by user
Nov 24 13:39:49 compute-1 sshd-session[217298]: Disconnected from user nova 192.168.122.100 port 41214
Nov 24 13:39:49 compute-1 sshd-session[217279]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:39:49 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Nov 24 13:39:49 compute-1 systemd-logind[815]: Session 45 logged out. Waiting for processes to exit.
Nov 24 13:39:49 compute-1 systemd-logind[815]: Removed session 45.
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.946 187082 DEBUG nova.compute.manager [req-09c37335-3006-4337-b149-db8d4edb6858 req-a9dcaee2-c1c3-4ef8-b22a-0de6411a626b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.948 187082 DEBUG oslo_concurrency.lockutils [req-09c37335-3006-4337-b149-db8d4edb6858 req-a9dcaee2-c1c3-4ef8-b22a-0de6411a626b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.949 187082 DEBUG oslo_concurrency.lockutils [req-09c37335-3006-4337-b149-db8d4edb6858 req-a9dcaee2-c1c3-4ef8-b22a-0de6411a626b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:50.948 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.949 187082 DEBUG oslo_concurrency.lockutils [req-09c37335-3006-4337-b149-db8d4edb6858 req-a9dcaee2-c1c3-4ef8-b22a-0de6411a626b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.949 187082 DEBUG nova.compute.manager [req-09c37335-3006-4337-b149-db8d4edb6858 req-a9dcaee2-c1c3-4ef8-b22a-0de6411a626b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.949 187082 DEBUG nova.compute.manager [req-09c37335-3006-4337-b149-db8d4edb6858 req-a9dcaee2-c1c3-4ef8-b22a-0de6411a626b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:39:50 compute-1 nova_compute[187078]: 2025-11-24 13:39:50.950 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:50 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:50.949 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:39:51 compute-1 sshd-session[217245]: Received disconnect from 45.78.194.40 port 40652:11: Bye Bye [preauth]
Nov 24 13:39:51 compute-1 sshd-session[217245]: Disconnected from invalid user cgpexpert 45.78.194.40 port 40652 [preauth]
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.786 187082 INFO nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Took 3.97 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.786 187082 DEBUG nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.803 187082 DEBUG nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8sgdfqlf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='32131561-653e-44d4-9108-c4a2b0328dbd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(95c59713-8812-4561-b673-40f44f9d694c),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.823 187082 DEBUG nova.objects.instance [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid 32131561-653e-44d4-9108-c4a2b0328dbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.824 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.826 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.826 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.843 187082 DEBUG nova.virt.libvirt.vif [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:38:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-25191881',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-25191881',id=23,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:38:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-9g80qya3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:38:50Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=32131561-653e-44d4-9108-c4a2b0328dbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.844 187082 DEBUG nova.network.os_vif_util [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.844 187082 DEBUG nova.network.os_vif_util [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.845 187082 DEBUG nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:39:51 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:64:0c:06"/>
Nov 24 13:39:51 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:39:51 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:39:51 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:39:51 compute-1 nova_compute[187078]:   <target dev="tap6784f44d-c9"/>
Nov 24 13:39:51 compute-1 nova_compute[187078]: </interface>
Nov 24 13:39:51 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:39:51 compute-1 nova_compute[187078]: 2025-11-24 13:39:51.845 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:39:52 compute-1 nova_compute[187078]: 2025-11-24 13:39:52.328 187082 DEBUG nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:39:52 compute-1 nova_compute[187078]: 2025-11-24 13:39:52.329 187082 INFO nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:39:52 compute-1 nova_compute[187078]: 2025-11-24 13:39:52.383 187082 INFO nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:39:52 compute-1 nova_compute[187078]: 2025-11-24 13:39:52.886 187082 DEBUG nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:39:52 compute-1 nova_compute[187078]: 2025-11-24 13:39:52.886 187082 DEBUG nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.025 187082 DEBUG nova.compute.manager [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.026 187082 DEBUG oslo_concurrency.lockutils [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.026 187082 DEBUG oslo_concurrency.lockutils [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.026 187082 DEBUG oslo_concurrency.lockutils [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.027 187082 DEBUG nova.compute.manager [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.027 187082 WARNING nova.compute.manager [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received unexpected event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with vm_state active and task_state migrating.
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.027 187082 DEBUG nova.compute.manager [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-changed-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.027 187082 DEBUG nova.compute.manager [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Refreshing instance network info cache due to event network-changed-6784f44d-c942-41a3-89a4-d1e14ae39574. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.028 187082 DEBUG oslo_concurrency.lockutils [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.028 187082 DEBUG oslo_concurrency.lockutils [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.028 187082 DEBUG nova.network.neutron [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Refreshing network info cache for port 6784f44d-c942-41a3-89a4-d1e14ae39574 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.313 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991593.3130467, 32131561-653e-44d4-9108-c4a2b0328dbd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.313 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] VM Paused (Lifecycle Event)
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.330 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.335 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.356 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.390 187082 DEBUG nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.391 187082 DEBUG nova.virt.libvirt.migration [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:39:53 compute-1 kernel: tap6784f44d-c9 (unregistering): left promiscuous mode
Nov 24 13:39:53 compute-1 NetworkManager[55527]: <info>  [1763991593.4595] device (tap6784f44d-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:39:53 compute-1 ovn_controller[95368]: 2025-11-24T13:39:53Z|00206|binding|INFO|Releasing lport 6784f44d-c942-41a3-89a4-d1e14ae39574 from this chassis (sb_readonly=0)
Nov 24 13:39:53 compute-1 ovn_controller[95368]: 2025-11-24T13:39:53Z|00207|binding|INFO|Setting lport 6784f44d-c942-41a3-89a4-d1e14ae39574 down in Southbound
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.468 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:53 compute-1 ovn_controller[95368]: 2025-11-24T13:39:53Z|00208|binding|INFO|Removing iface tap6784f44d-c9 ovn-installed in OVS
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.479 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:0c:06 10.100.0.10'], port_security=['fa:16:3e:64:0c:06 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '32131561-653e-44d4-9108-c4a2b0328dbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a66bcdc071b741ef8709a4608acd6051', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7919c96b-821d-4f19-8007-93cc72ae0ab8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7427b943-8d7d-4f9b-bcdf-8241a54887e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=6784f44d-c942-41a3-89a4-d1e14ae39574) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.481 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 6784f44d-c942-41a3-89a4-d1e14ae39574 in datapath ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 unbound from our chassis
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.482 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.484 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4653ae-a50e-4ec3-adf1-d38bd1560074]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.484 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 namespace which is not needed anymore
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:53 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 24 13:39:53 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 15.594s CPU time.
Nov 24 13:39:53 compute-1 systemd-machined[153355]: Machine qemu-17-instance-00000017 terminated.
Nov 24 13:39:53 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [NOTICE]   (217007) : haproxy version is 2.8.14-c23fe91
Nov 24 13:39:53 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [NOTICE]   (217007) : path to executable is /usr/sbin/haproxy
Nov 24 13:39:53 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [WARNING]  (217007) : Exiting Master process...
Nov 24 13:39:53 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [WARNING]  (217007) : Exiting Master process...
Nov 24 13:39:53 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [ALERT]    (217007) : Current worker (217009) exited with code 143 (Terminated)
Nov 24 13:39:53 compute-1 neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0[217003]: [WARNING]  (217007) : All workers exited. Exiting... (0)
Nov 24 13:39:53 compute-1 systemd[1]: libpod-5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6.scope: Deactivated successfully.
Nov 24 13:39:53 compute-1 podman[217335]: 2025-11-24 13:39:53.635983172 +0000 UTC m=+0.047336764 container died 5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:39:53 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6-userdata-shm.mount: Deactivated successfully.
Nov 24 13:39:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-3f3447e99e661791f07702b2df16b317ed2d9eb2019a6abfa0a1dfd53ad27ec6-merged.mount: Deactivated successfully.
Nov 24 13:39:53 compute-1 podman[217335]: 2025-11-24 13:39:53.680384676 +0000 UTC m=+0.091738268 container cleanup 5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 13:39:53 compute-1 systemd[1]: libpod-conmon-5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6.scope: Deactivated successfully.
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.714 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.716 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.717 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.717 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:39:53 compute-1 podman[217377]: 2025-11-24 13:39:53.769324948 +0000 UTC m=+0.058464083 container remove 5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.778 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[edc3a359-1d94-4567-aeef-ddea20e9e745]: (4, ('Mon Nov 24 01:39:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6)\n5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6\nMon Nov 24 01:39:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 (5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6)\n5c195145c5a7888fe478f06ef904d336949ccbcffcfe34cdf0163ad95a72a7f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.780 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[cb374441-5d01-40af-b872-11fb9ebd4496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.781 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee6bf4e1-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.782 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:53 compute-1 kernel: tapee6bf4e1-a0: left promiscuous mode
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.798 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.800 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[47493308-0ee6-4acc-9fef-a5530de27d73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.814 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[dee4a0b9-6b95-4c86-94a4-f6d99cf19dd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.815 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0e70fc13-f8c0-42ca-a246-b63838da296c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.839 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd4d421-27bd-45e0-8d9d-0e9c6e7d01fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454793, 'reachable_time': 44018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217399, 'error': None, 'target': 'ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 systemd[1]: run-netns-ovnmeta\x2dee6bf4e1\x2dadcd\x2d4f6c\x2d8b46\x2deaa71e64e9c0.mount: Deactivated successfully.
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.845 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:39:53 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:39:53.845 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[cae773f5-7d8e-430a-95fc-1d06d8074fd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.894 187082 DEBUG nova.virt.libvirt.guest [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '32131561-653e-44d4-9108-c4a2b0328dbd' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.894 187082 INFO nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migration operation has completed
Nov 24 13:39:53 compute-1 nova_compute[187078]: 2025-11-24 13:39:53.895 187082 INFO nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] _post_live_migration() is started..
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.155 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.509 187082 DEBUG nova.network.neutron [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updated VIF entry in instance network info cache for port 6784f44d-c942-41a3-89a4-d1e14ae39574. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.510 187082 DEBUG nova.network.neutron [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Updating instance_info_cache with network_info: [{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.526 187082 DEBUG oslo_concurrency.lockutils [req-f2336bfa-f1b7-48c6-8b14-2d012ce64081 req-c7d51178-f6c1-4951-9cd2-4ea5b314e6c1 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-32131561-653e-44d4-9108-c4a2b0328dbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.647 187082 DEBUG nova.network.neutron [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port 6784f44d-c942-41a3-89a4-d1e14ae39574 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.648 187082 DEBUG nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.649 187082 DEBUG nova.virt.libvirt.vif [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:38:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-25191881',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-25191881',id=23,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:38:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a66bcdc071b741ef8709a4608acd6051',ramdisk_id='',reservation_id='r-9g80qya3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-392394962',owner_user_name='tempest-TestExecuteStrategies-392394962-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:39:45Z,user_data=None,user_id='44609a4d2fa941a4b26d6b27a5d4a6d2',uuid=32131561-653e-44d4-9108-c4a2b0328dbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.649 187082 DEBUG nova.network.os_vif_util [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "6784f44d-c942-41a3-89a4-d1e14ae39574", "address": "fa:16:3e:64:0c:06", "network": {"id": "ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-792228601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a66bcdc071b741ef8709a4608acd6051", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6784f44d-c9", "ovs_interfaceid": "6784f44d-c942-41a3-89a4-d1e14ae39574", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.650 187082 DEBUG nova.network.os_vif_util [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.651 187082 DEBUG os_vif [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.653 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.654 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6784f44d-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.656 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.658 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.660 187082 INFO os_vif [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:0c:06,bridge_name='br-int',has_traffic_filtering=True,id=6784f44d-c942-41a3-89a4-d1e14ae39574,network=Network(ee6bf4e1-adcd-4f6c-8b46-eaa71e64e9c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6784f44d-c9')
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.660 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.660 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.660 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.661 187082 DEBUG nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.661 187082 INFO nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Deleting instance files /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd_del
Nov 24 13:39:54 compute-1 nova_compute[187078]: 2025-11-24 13:39:54.662 187082 INFO nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Deletion of /var/lib/nova/instances/32131561-653e-44d4-9108-c4a2b0328dbd_del complete
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.121 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.121 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.122 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.122 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.122 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.122 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.123 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.123 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.123 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.124 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.124 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.124 187082 WARNING nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received unexpected event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with vm_state active and task_state migrating.
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.125 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.125 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.125 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.125 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.126 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.126 187082 WARNING nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received unexpected event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with vm_state active and task_state migrating.
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.126 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.126 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.127 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.127 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.127 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.128 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-unplugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.128 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.128 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.128 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.129 187082 DEBUG oslo_concurrency.lockutils [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.129 187082 DEBUG nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:55 compute-1 nova_compute[187078]: 2025-11-24 13:39:55.129 187082 WARNING nova.compute.manager [req-d26aac91-c196-4bb8-bdb3-a43b7074d703 req-809e46e3-0353-430b-8829-5538cf030208 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received unexpected event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with vm_state active and task_state migrating.
Nov 24 13:39:56 compute-1 podman[217400]: 2025-11-24 13:39:56.523732328 +0000 UTC m=+0.061175016 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:39:56 compute-1 podman[217401]: 2025-11-24 13:39:56.538962158 +0000 UTC m=+0.067006174 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:39:57 compute-1 nova_compute[187078]: 2025-11-24 13:39:57.230 187082 DEBUG nova.compute.manager [req-2d2a9605-10f4-4aeb-8181-aa240aa39571 req-a82b25a3-ec56-46ae-b8f6-b4293f1650f7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:39:57 compute-1 nova_compute[187078]: 2025-11-24 13:39:57.230 187082 DEBUG oslo_concurrency.lockutils [req-2d2a9605-10f4-4aeb-8181-aa240aa39571 req-a82b25a3-ec56-46ae-b8f6-b4293f1650f7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:57 compute-1 nova_compute[187078]: 2025-11-24 13:39:57.231 187082 DEBUG oslo_concurrency.lockutils [req-2d2a9605-10f4-4aeb-8181-aa240aa39571 req-a82b25a3-ec56-46ae-b8f6-b4293f1650f7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:57 compute-1 nova_compute[187078]: 2025-11-24 13:39:57.231 187082 DEBUG oslo_concurrency.lockutils [req-2d2a9605-10f4-4aeb-8181-aa240aa39571 req-a82b25a3-ec56-46ae-b8f6-b4293f1650f7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:57 compute-1 nova_compute[187078]: 2025-11-24 13:39:57.231 187082 DEBUG nova.compute.manager [req-2d2a9605-10f4-4aeb-8181-aa240aa39571 req-a82b25a3-ec56-46ae-b8f6-b4293f1650f7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] No waiting events found dispatching network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:39:57 compute-1 nova_compute[187078]: 2025-11-24 13:39:57.231 187082 WARNING nova.compute.manager [req-2d2a9605-10f4-4aeb-8181-aa240aa39571 req-a82b25a3-ec56-46ae-b8f6-b4293f1650f7 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Received unexpected event network-vif-plugged-6784f44d-c942-41a3-89a4-d1e14ae39574 for instance with vm_state active and task_state migrating.
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.156 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.656 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.725 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.726 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.726 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "32131561-653e-44d4-9108-c4a2b0328dbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.749 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.750 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.750 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.751 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:39:59 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:39:59 compute-1 systemd[217283]: Activating special unit Exit the Session...
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped target Main User Target.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped target Basic System.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped target Paths.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped target Sockets.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped target Timers.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:39:59 compute-1 systemd[217283]: Closed D-Bus User Message Bus Socket.
Nov 24 13:39:59 compute-1 systemd[217283]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:39:59 compute-1 systemd[217283]: Removed slice User Application Slice.
Nov 24 13:39:59 compute-1 systemd[217283]: Reached target Shutdown.
Nov 24 13:39:59 compute-1 systemd[217283]: Finished Exit the Session.
Nov 24 13:39:59 compute-1 systemd[217283]: Reached target Exit the Session.
Nov 24 13:39:59 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:39:59 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:39:59 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:39:59 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:39:59 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:39:59 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.978 187082 WARNING nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.980 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5871MB free_disk=73.45927429199219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.981 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:39:59 compute-1 nova_compute[187078]: 2025-11-24 13:39:59.981 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:39:59 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.025 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance 32131561-653e-44d4-9108-c4a2b0328dbd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.042 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.073 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration 95c59713-8812-4561-b673-40f44f9d694c is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.074 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.075 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.131 187082 DEBUG nova.compute.provider_tree [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.145 187082 DEBUG nova.scheduler.client.report [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.166 187082 DEBUG nova.compute.resource_tracker [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.167 187082 DEBUG oslo_concurrency.lockutils [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.175 187082 INFO nova.compute.manager [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.267 187082 INFO nova.scheduler.client.report [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration 95c59713-8812-4561-b673-40f44f9d694c
Nov 24 13:40:00 compute-1 nova_compute[187078]: 2025-11-24 13:40:00.268 187082 DEBUG nova.virt.libvirt.driver [None req-7ca63ebf-7ce3-4953-b2d6-cda779644ef8 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:40:00 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:00.952 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:40:02 compute-1 podman[217446]: 2025-11-24 13:40:02.519876564 +0000 UTC m=+0.066845549 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 13:40:02 compute-1 podman[217447]: 2025-11-24 13:40:02.555658197 +0000 UTC m=+0.091048891 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:40:04 compute-1 nova_compute[187078]: 2025-11-24 13:40:04.158 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:04.168 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:40:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:04.169 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:40:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:04.169 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:40:04 compute-1 nova_compute[187078]: 2025-11-24 13:40:04.658 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:05 compute-1 podman[197429]: time="2025-11-24T13:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:40:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:40:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Nov 24 13:40:08 compute-1 nova_compute[187078]: 2025-11-24 13:40:08.716 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991593.7137892, 32131561-653e-44d4-9108-c4a2b0328dbd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:40:08 compute-1 nova_compute[187078]: 2025-11-24 13:40:08.716 187082 INFO nova.compute.manager [-] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] VM Stopped (Lifecycle Event)
Nov 24 13:40:08 compute-1 nova_compute[187078]: 2025-11-24 13:40:08.732 187082 DEBUG nova.compute.manager [None req-4804fcac-2a84-40cb-ab74-2109e6a783ce - - - - - -] [instance: 32131561-653e-44d4-9108-c4a2b0328dbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:40:09 compute-1 nova_compute[187078]: 2025-11-24 13:40:09.160 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:09 compute-1 nova_compute[187078]: 2025-11-24 13:40:09.443 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:09 compute-1 nova_compute[187078]: 2025-11-24 13:40:09.660 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:12 compute-1 podman[217490]: 2025-11-24 13:40:12.553082499 +0000 UTC m=+0.078514693 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Nov 24 13:40:14 compute-1 nova_compute[187078]: 2025-11-24 13:40:14.162 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:14 compute-1 nova_compute[187078]: 2025-11-24 13:40:14.661 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:16 compute-1 nova_compute[187078]: 2025-11-24 13:40:16.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:19 compute-1 nova_compute[187078]: 2025-11-24 13:40:19.188 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: ERROR   13:40:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: ERROR   13:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: ERROR   13:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: ERROR   13:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: ERROR   13:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:40:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:40:19 compute-1 nova_compute[187078]: 2025-11-24 13:40:19.663 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:19 compute-1 nova_compute[187078]: 2025-11-24 13:40:19.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:19 compute-1 nova_compute[187078]: 2025-11-24 13:40:19.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:22 compute-1 nova_compute[187078]: 2025-11-24 13:40:22.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:23 compute-1 nova_compute[187078]: 2025-11-24 13:40:23.428 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:23 compute-1 nova_compute[187078]: 2025-11-24 13:40:23.684 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:23 compute-1 nova_compute[187078]: 2025-11-24 13:40:23.685 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:24 compute-1 nova_compute[187078]: 2025-11-24 13:40:24.189 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:24 compute-1 nova_compute[187078]: 2025-11-24 13:40:24.665 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.686 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.686 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.686 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.829 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.830 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5882MB free_disk=73.45927429199219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.830 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.831 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.957 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:40:25 compute-1 nova_compute[187078]: 2025-11-24 13:40:25.957 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:40:26 compute-1 nova_compute[187078]: 2025-11-24 13:40:26.047 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:40:26 compute-1 nova_compute[187078]: 2025-11-24 13:40:26.062 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:40:26 compute-1 nova_compute[187078]: 2025-11-24 13:40:26.063 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:40:26 compute-1 nova_compute[187078]: 2025-11-24 13:40:26.063 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:40:27 compute-1 podman[217513]: 2025-11-24 13:40:27.508971149 +0000 UTC m=+0.054552477 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:40:27 compute-1 podman[217512]: 2025-11-24 13:40:27.529088771 +0000 UTC m=+0.068827192 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.064 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.064 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.064 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.084 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.084 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.084 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.240 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:29 compute-1 nova_compute[187078]: 2025-11-24 13:40:29.667 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:31 compute-1 nova_compute[187078]: 2025-11-24 13:40:31.680 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:33 compute-1 podman[217555]: 2025-11-24 13:40:33.547214179 +0000 UTC m=+0.089612291 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 24 13:40:33 compute-1 podman[217556]: 2025-11-24 13:40:33.58143794 +0000 UTC m=+0.124039607 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:40:33 compute-1 nova_compute[187078]: 2025-11-24 13:40:33.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:34 compute-1 nova_compute[187078]: 2025-11-24 13:40:34.241 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:34 compute-1 nova_compute[187078]: 2025-11-24 13:40:34.669 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:35 compute-1 podman[197429]: time="2025-11-24T13:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:40:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:40:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:40:36 compute-1 ovn_controller[95368]: 2025-11-24T13:40:36Z|00209|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 24 13:40:39 compute-1 nova_compute[187078]: 2025-11-24 13:40:39.259 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:39 compute-1 nova_compute[187078]: 2025-11-24 13:40:39.671 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:39 compute-1 sshd-session[217601]: Invalid user teamspeak from 5.198.176.28 port 45520
Nov 24 13:40:40 compute-1 sshd-session[217601]: Received disconnect from 5.198.176.28 port 45520:11: Bye Bye [preauth]
Nov 24 13:40:40 compute-1 sshd-session[217601]: Disconnected from invalid user teamspeak 5.198.176.28 port 45520 [preauth]
Nov 24 13:40:42 compute-1 nova_compute[187078]: 2025-11-24 13:40:42.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:42 compute-1 nova_compute[187078]: 2025-11-24 13:40:42.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:40:42 compute-1 nova_compute[187078]: 2025-11-24 13:40:42.693 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:40:42 compute-1 nova_compute[187078]: 2025-11-24 13:40:42.893 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:43 compute-1 podman[217603]: 2025-11-24 13:40:43.531684094 +0000 UTC m=+0.078190584 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Nov 24 13:40:43 compute-1 nova_compute[187078]: 2025-11-24 13:40:43.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:40:43 compute-1 nova_compute[187078]: 2025-11-24 13:40:43.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:40:44 compute-1 nova_compute[187078]: 2025-11-24 13:40:44.260 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:44 compute-1 nova_compute[187078]: 2025-11-24 13:40:44.673 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:49 compute-1 nova_compute[187078]: 2025-11-24 13:40:49.302 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: ERROR   13:40:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: ERROR   13:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: ERROR   13:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: ERROR   13:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: ERROR   13:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:40:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:40:49 compute-1 nova_compute[187078]: 2025-11-24 13:40:49.675 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:51 compute-1 sshd-session[217625]: Received disconnect from 68.183.82.237 port 53810:11: Bye Bye [preauth]
Nov 24 13:40:51 compute-1 sshd-session[217625]: Disconnected from authenticating user root 68.183.82.237 port 53810 [preauth]
Nov 24 13:40:51 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:51.741 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:40:51 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:51.742 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:40:51 compute-1 nova_compute[187078]: 2025-11-24 13:40:51.792 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:54 compute-1 nova_compute[187078]: 2025-11-24 13:40:54.351 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:54 compute-1 nova_compute[187078]: 2025-11-24 13:40:54.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:58 compute-1 podman[217628]: 2025-11-24 13:40:58.564478484 +0000 UTC m=+0.087112524 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:40:58 compute-1 podman[217627]: 2025-11-24 13:40:58.582244392 +0000 UTC m=+0.103587827 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:40:58 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:40:58.745 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:40:59 compute-1 nova_compute[187078]: 2025-11-24 13:40:59.353 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:40:59 compute-1 nova_compute[187078]: 2025-11-24 13:40:59.679 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:04.169 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:04.170 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:04.170 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:04 compute-1 nova_compute[187078]: 2025-11-24 13:41:04.361 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:04 compute-1 podman[217672]: 2025-11-24 13:41:04.552722089 +0000 UTC m=+0.096141547 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 24 13:41:04 compute-1 podman[217673]: 2025-11-24 13:41:04.597152554 +0000 UTC m=+0.130767748 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:41:04 compute-1 nova_compute[187078]: 2025-11-24 13:41:04.680 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:05 compute-1 podman[197429]: time="2025-11-24T13:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:41:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:41:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:41:06 compute-1 sshd-session[217719]: Received disconnect from 175.100.24.139 port 48316:11: Bye Bye [preauth]
Nov 24 13:41:06 compute-1 sshd-session[217719]: Disconnected from authenticating user root 175.100.24.139 port 48316 [preauth]
Nov 24 13:41:09 compute-1 nova_compute[187078]: 2025-11-24 13:41:09.368 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:09 compute-1 nova_compute[187078]: 2025-11-24 13:41:09.682 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:12 compute-1 sshd-session[217721]: Invalid user solana from 193.32.162.146 port 34510
Nov 24 13:41:12 compute-1 sshd-session[217721]: Connection closed by invalid user solana 193.32.162.146 port 34510 [preauth]
Nov 24 13:41:14 compute-1 nova_compute[187078]: 2025-11-24 13:41:14.370 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:14 compute-1 podman[217723]: 2025-11-24 13:41:14.533036241 +0000 UTC m=+0.084522054 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 13:41:14 compute-1 nova_compute[187078]: 2025-11-24 13:41:14.685 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:19 compute-1 nova_compute[187078]: 2025-11-24 13:41:19.402 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: ERROR   13:41:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: ERROR   13:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: ERROR   13:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: ERROR   13:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: ERROR   13:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:41:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:41:19 compute-1 nova_compute[187078]: 2025-11-24 13:41:19.687 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:20 compute-1 ovn_controller[95368]: 2025-11-24T13:41:20Z|00210|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 24 13:41:20 compute-1 nova_compute[187078]: 2025-11-24 13:41:20.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:21 compute-1 nova_compute[187078]: 2025-11-24 13:41:21.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:23 compute-1 nova_compute[187078]: 2025-11-24 13:41:23.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:24 compute-1 nova_compute[187078]: 2025-11-24 13:41:24.403 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:24 compute-1 nova_compute[187078]: 2025-11-24 13:41:24.689 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:25 compute-1 nova_compute[187078]: 2025-11-24 13:41:25.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:25 compute-1 nova_compute[187078]: 2025-11-24 13:41:25.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.692 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.924 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.925 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5890MB free_disk=73.45927429199219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.925 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.925 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.977 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:41:26 compute-1 nova_compute[187078]: 2025-11-24 13:41:26.978 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:41:27 compute-1 nova_compute[187078]: 2025-11-24 13:41:27.051 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:41:27 compute-1 nova_compute[187078]: 2025-11-24 13:41:27.063 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:41:27 compute-1 nova_compute[187078]: 2025-11-24 13:41:27.066 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:41:27 compute-1 nova_compute[187078]: 2025-11-24 13:41:27.067 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.067 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.068 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.406 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:29 compute-1 podman[217744]: 2025-11-24 13:41:29.515745454 +0000 UTC m=+0.058491755 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:41:29 compute-1 podman[217745]: 2025-11-24 13:41:29.525570457 +0000 UTC m=+0.062819800 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 24 13:41:29 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:29.537 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:41:29 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:29.538 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.539 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.677 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:41:29 compute-1 nova_compute[187078]: 2025-11-24 13:41:29.691 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:34 compute-1 nova_compute[187078]: 2025-11-24 13:41:34.408 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:34 compute-1 nova_compute[187078]: 2025-11-24 13:41:34.670 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:41:34 compute-1 nova_compute[187078]: 2025-11-24 13:41:34.693 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:35 compute-1 podman[217789]: 2025-11-24 13:41:35.518798006 +0000 UTC m=+0.061287869 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 24 13:41:35 compute-1 podman[217790]: 2025-11-24 13:41:35.594749428 +0000 UTC m=+0.135393552 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:41:35 compute-1 podman[197429]: time="2025-11-24T13:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:41:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:41:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.821 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.821 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.836 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.928 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.929 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.937 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:41:36 compute-1 nova_compute[187078]: 2025-11-24 13:41:36.937 187082 INFO nova.compute.claims [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.110 187082 DEBUG nova.compute.provider_tree [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.127 187082 DEBUG nova.scheduler.client.report [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.156 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.157 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.208 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.209 187082 DEBUG nova.network.neutron [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.227 187082 INFO nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.242 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.349 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.351 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.352 187082 INFO nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Creating image(s)
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.353 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "/var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.353 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "/var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.355 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "/var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.380 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.478 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.479 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.480 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.492 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.547 187082 DEBUG nova.policy [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c2c4276f4184eee9cbbb3ab03c0b339', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c88c7429e05d451c84ded4ece852815d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.558 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.558 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.602 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.603 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.603 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.676 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.677 187082 DEBUG nova.virt.disk.api [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Checking if we can resize image /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.678 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.740 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.742 187082 DEBUG nova.virt.disk.api [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Cannot resize image /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.742 187082 DEBUG nova.objects.instance [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lazy-loading 'migration_context' on Instance uuid 1c31b98e-6492-42f3-9dff-13cbc23050de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.760 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.761 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Ensure instance console log exists: /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.761 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.762 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.762 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:37 compute-1 nova_compute[187078]: 2025-11-24 13:41:37.976 187082 DEBUG nova.network.neutron [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Successfully created port: 6e12565f-48b1-4286-a5de-b083a7a7691d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.749 187082 DEBUG nova.network.neutron [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Successfully updated port: 6e12565f-48b1-4286-a5de-b083a7a7691d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.763 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.763 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquired lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.763 187082 DEBUG nova.network.neutron [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.822 187082 DEBUG nova.compute.manager [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-changed-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.822 187082 DEBUG nova.compute.manager [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Refreshing instance network info cache due to event network-changed-6e12565f-48b1-4286-a5de-b083a7a7691d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.823 187082 DEBUG oslo_concurrency.lockutils [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:41:38 compute-1 nova_compute[187078]: 2025-11-24 13:41:38.883 187082 DEBUG nova.network.neutron [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.411 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:39 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:39.540 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.543 187082 DEBUG nova.network.neutron [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updating instance_info_cache with network_info: [{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.556 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Releasing lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.556 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Instance network_info: |[{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.557 187082 DEBUG oslo_concurrency.lockutils [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.557 187082 DEBUG nova.network.neutron [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Refreshing network info cache for port 6e12565f-48b1-4286-a5de-b083a7a7691d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.559 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Start _get_guest_xml network_info=[{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.564 187082 WARNING nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.571 187082 DEBUG nova.virt.libvirt.host [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.572 187082 DEBUG nova.virt.libvirt.host [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.576 187082 DEBUG nova.virt.libvirt.host [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.577 187082 DEBUG nova.virt.libvirt.host [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.578 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.578 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.578 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.578 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.579 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.579 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.579 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.579 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.579 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.580 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.580 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.580 187082 DEBUG nova.virt.hardware [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.583 187082 DEBUG nova.virt.libvirt.vif [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:41:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1794161449',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1794161449',id=26,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c88c7429e05d451c84ded4ece852815d',ramdisk_id='',reservation_id='r-t930adz6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:41:37Z,user_data=None,user_id='6c2c4276f4184eee9cbbb3ab03c0b339',uuid=1c31b98e-6492-42f3-9dff-13cbc23050de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.583 187082 DEBUG nova.network.os_vif_util [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Converting VIF {"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.584 187082 DEBUG nova.network.os_vif_util [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.585 187082 DEBUG nova.objects.instance [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c31b98e-6492-42f3-9dff-13cbc23050de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.596 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <uuid>1c31b98e-6492-42f3-9dff-13cbc23050de</uuid>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <name>instance-0000001a</name>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1794161449</nova:name>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:41:39</nova:creationTime>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:user uuid="6c2c4276f4184eee9cbbb3ab03c0b339">tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928-project-member</nova:user>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:project uuid="c88c7429e05d451c84ded4ece852815d">tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928</nova:project>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         <nova:port uuid="6e12565f-48b1-4286-a5de-b083a7a7691d">
Nov 24 13:41:39 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <system>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <entry name="serial">1c31b98e-6492-42f3-9dff-13cbc23050de</entry>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <entry name="uuid">1c31b98e-6492-42f3-9dff-13cbc23050de</entry>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </system>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <os>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </os>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <features>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </features>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.config"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:5b:fe:d4"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <target dev="tap6e12565f-48"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/console.log" append="off"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <video>
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </video>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:41:39 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:41:39 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:41:39 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:41:39 compute-1 nova_compute[187078]: </domain>
Nov 24 13:41:39 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.598 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Preparing to wait for external event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.599 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.599 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.600 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.601 187082 DEBUG nova.virt.libvirt.vif [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:41:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1794161449',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1794161449',id=26,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c88c7429e05d451c84ded4ece852815d',ramdisk_id='',reservation_id='r-t930adz6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:41:37Z,user_data=None,user_id='6c2c4276f4184eee9cbbb3ab03c0b339',uuid=1c31b98e-6492-42f3-9dff-13cbc23050de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.602 187082 DEBUG nova.network.os_vif_util [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Converting VIF {"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.603 187082 DEBUG nova.network.os_vif_util [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.604 187082 DEBUG os_vif [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.605 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.605 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.606 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.610 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.611 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e12565f-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.612 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e12565f-48, col_values=(('external_ids', {'iface-id': '6e12565f-48b1-4286-a5de-b083a7a7691d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:fe:d4', 'vm-uuid': '1c31b98e-6492-42f3-9dff-13cbc23050de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.614 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:39 compute-1 NetworkManager[55527]: <info>  [1763991699.6148] manager: (tap6e12565f-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.616 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.621 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.623 187082 INFO os_vif [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48')
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.670 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.671 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.671 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] No VIF found with MAC fa:16:3e:5b:fe:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:41:39 compute-1 nova_compute[187078]: 2025-11-24 13:41:39.671 187082 INFO nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Using config drive
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.225 187082 INFO nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Creating config drive at /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.config
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.230 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnsnv06am execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.356 187082 DEBUG oslo_concurrency.processutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnsnv06am" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:41:40 compute-1 kernel: tap6e12565f-48: entered promiscuous mode
Nov 24 13:41:40 compute-1 NetworkManager[55527]: <info>  [1763991700.4150] manager: (tap6e12565f-48): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Nov 24 13:41:40 compute-1 ovn_controller[95368]: 2025-11-24T13:41:40Z|00211|binding|INFO|Claiming lport 6e12565f-48b1-4286-a5de-b083a7a7691d for this chassis.
Nov 24 13:41:40 compute-1 ovn_controller[95368]: 2025-11-24T13:41:40Z|00212|binding|INFO|6e12565f-48b1-4286-a5de-b083a7a7691d: Claiming fa:16:3e:5b:fe:d4 10.100.0.14
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.417 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 systemd-udevd[217866]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.451 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:fe:d4 10.100.0.14'], port_security=['fa:16:3e:5b:fe:d4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1c31b98e-6492-42f3-9dff-13cbc23050de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c88c7429e05d451c84ded4ece852815d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fa6db478-7702-44a4-bd64-d0b43349d3b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50d784e-9814-4b17-8c71-48498f64d246, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=6e12565f-48b1-4286-a5de-b083a7a7691d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.453 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 6e12565f-48b1-4286-a5de-b083a7a7691d in datapath 3c401877-c5ee-45c7-b1e0-8a6c54929af1 bound to our chassis
Nov 24 13:41:40 compute-1 NetworkManager[55527]: <info>  [1763991700.4556] device (tap6e12565f-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:41:40 compute-1 NetworkManager[55527]: <info>  [1763991700.4564] device (tap6e12565f-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.456 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c401877-c5ee-45c7-b1e0-8a6c54929af1
Nov 24 13:41:40 compute-1 systemd-machined[153355]: New machine qemu-18-instance-0000001a.
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.474 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e410f834-b796-4f26-b201-9285696dc696]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.476 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c401877-c1 in ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.478 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c401877-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.478 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[470fe186-aff3-4db8-9a40-4b6d362ec449]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.478 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1839e9ba-e17d-475c-b8da-069b48da36f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.489 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.493 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[265f1063-1a12-45c3-ac44-21824617c549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.496 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-0000001a.
Nov 24 13:41:40 compute-1 ovn_controller[95368]: 2025-11-24T13:41:40Z|00213|binding|INFO|Setting lport 6e12565f-48b1-4286-a5de-b083a7a7691d ovn-installed in OVS
Nov 24 13:41:40 compute-1 ovn_controller[95368]: 2025-11-24T13:41:40Z|00214|binding|INFO|Setting lport 6e12565f-48b1-4286-a5de-b083a7a7691d up in Southbound
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.500 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.519 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[59d826d1-13bd-43d5-8d4b-64cca223d28c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.549 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[39206539-05eb-4ede-a566-95b2a54236a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 NetworkManager[55527]: <info>  [1763991700.5558] manager: (tap3c401877-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Nov 24 13:41:40 compute-1 systemd-udevd[217869]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.554 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[890c1c35-0e62-4bb9-ab1a-91538d281783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.586 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[39808797-c381-47eb-88b0-9720c1a1c03a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.589 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c74723e6-2ae2-4eb6-9ebc-06ee1e1e1ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 NetworkManager[55527]: <info>  [1763991700.6087] device (tap3c401877-c0): carrier: link connected
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.613 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[a08780a3-4554-4ea3-8e07-8e73136708af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.629 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4daf4e5a-19da-4bf2-a421-da3f44fa12b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c401877-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:9f:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471926, 'reachable_time': 35685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217902, 'error': None, 'target': 'ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.643 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d2672a-62d0-449c-bb4f-3b0b230a211b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:9fc6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 471926, 'tstamp': 471926}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217903, 'error': None, 'target': 'ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.658 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c82c66a2-d5dc-4b3e-8555-c4dd1974bb94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c401877-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:9f:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471926, 'reachable_time': 35685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217904, 'error': None, 'target': 'ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.684 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ada7fb-3c34-4f43-baea-b6caef9595e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.738 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9d10bfa7-79e9-4b8e-a2fd-abde5042f482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.739 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c401877-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.740 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.740 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c401877-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.741 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 NetworkManager[55527]: <info>  [1763991700.7426] manager: (tap3c401877-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 24 13:41:40 compute-1 kernel: tap3c401877-c0: entered promiscuous mode
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.745 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.746 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c401877-c0, col_values=(('external_ids', {'iface-id': 'a11aa1b7-0c96-47df-b613-9afc903220f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.747 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 ovn_controller[95368]: 2025-11-24T13:41:40Z|00215|binding|INFO|Releasing lport a11aa1b7-0c96-47df-b613-9afc903220f4 from this chassis (sb_readonly=0)
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.758 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.759 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c401877-c5ee-45c7-b1e0-8a6c54929af1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c401877-c5ee-45c7-b1e0-8a6c54929af1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.760 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4b75ae50-9c39-4b85-a699-5ded9b3823d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.761 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-3c401877-c5ee-45c7-b1e0-8a6c54929af1
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/3c401877-c5ee-45c7-b1e0-8a6c54929af1.pid.haproxy
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID 3c401877-c5ee-45c7-b1e0-8a6c54929af1
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:41:40 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:41:40.762 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'env', 'PROCESS_TAG=haproxy-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c401877-c5ee-45c7-b1e0-8a6c54929af1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.996 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991700.9955542, 1c31b98e-6492-42f3-9dff-13cbc23050de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:41:40 compute-1 nova_compute[187078]: 2025-11-24 13:41:40.996 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] VM Started (Lifecycle Event)
Nov 24 13:41:41 compute-1 nova_compute[187078]: 2025-11-24 13:41:41.013 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:41:41 compute-1 nova_compute[187078]: 2025-11-24 13:41:41.018 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991700.9956815, 1c31b98e-6492-42f3-9dff-13cbc23050de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:41:41 compute-1 nova_compute[187078]: 2025-11-24 13:41:41.018 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] VM Paused (Lifecycle Event)
Nov 24 13:41:41 compute-1 nova_compute[187078]: 2025-11-24 13:41:41.040 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:41:41 compute-1 nova_compute[187078]: 2025-11-24 13:41:41.044 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:41:41 compute-1 nova_compute[187078]: 2025-11-24 13:41:41.063 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:41:41 compute-1 podman[217942]: 2025-11-24 13:41:41.118497311 +0000 UTC m=+0.052049331 container create f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:41:41 compute-1 systemd[1]: Started libpod-conmon-f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8.scope.
Nov 24 13:41:41 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:41:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e7da0b2678acb9c1192aae54868f7bb0480af3568bab1d6df85096f92d7e11d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:41:41 compute-1 podman[217942]: 2025-11-24 13:41:41.094194827 +0000 UTC m=+0.027746877 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:41:41 compute-1 podman[217942]: 2025-11-24 13:41:41.196723684 +0000 UTC m=+0.130275704 container init f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:41:41 compute-1 podman[217942]: 2025-11-24 13:41:41.201677168 +0000 UTC m=+0.135229188 container start f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:41:41 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [NOTICE]   (217961) : New worker (217963) forked
Nov 24 13:41:41 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [NOTICE]   (217961) : Loading success.
Nov 24 13:41:42 compute-1 nova_compute[187078]: 2025-11-24 13:41:42.592 187082 DEBUG nova.network.neutron [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updated VIF entry in instance network info cache for port 6e12565f-48b1-4286-a5de-b083a7a7691d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:41:42 compute-1 nova_compute[187078]: 2025-11-24 13:41:42.593 187082 DEBUG nova.network.neutron [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updating instance_info_cache with network_info: [{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:41:42 compute-1 nova_compute[187078]: 2025-11-24 13:41:42.605 187082 DEBUG oslo_concurrency.lockutils [req-05079329-6966-446b-a573-a8b72d14ab1e req-50f78efe-1f95-4412-865d-0f0b4590b2c5 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:41:42 compute-1 sshd-session[217972]: Invalid user sol from 45.148.10.240 port 53712
Nov 24 13:41:43 compute-1 sshd-session[217972]: Connection closed by invalid user sol 45.148.10.240 port 53712 [preauth]
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.413 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.614 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.644 187082 DEBUG nova.compute.manager [req-007c0f1f-8454-469f-ac91-f9cae415a519 req-557a459f-74b1-412b-9cd7-8a8e99a163a6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.644 187082 DEBUG oslo_concurrency.lockutils [req-007c0f1f-8454-469f-ac91-f9cae415a519 req-557a459f-74b1-412b-9cd7-8a8e99a163a6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.645 187082 DEBUG oslo_concurrency.lockutils [req-007c0f1f-8454-469f-ac91-f9cae415a519 req-557a459f-74b1-412b-9cd7-8a8e99a163a6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.645 187082 DEBUG oslo_concurrency.lockutils [req-007c0f1f-8454-469f-ac91-f9cae415a519 req-557a459f-74b1-412b-9cd7-8a8e99a163a6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.645 187082 DEBUG nova.compute.manager [req-007c0f1f-8454-469f-ac91-f9cae415a519 req-557a459f-74b1-412b-9cd7-8a8e99a163a6 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Processing event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.645 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.653 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991704.6525161, 1c31b98e-6492-42f3-9dff-13cbc23050de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.653 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] VM Resumed (Lifecycle Event)
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.661 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.666 187082 INFO nova.virt.libvirt.driver [-] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Instance spawned successfully.
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.668 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.683 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.690 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.693 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.693 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.694 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.694 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.694 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.695 187082 DEBUG nova.virt.libvirt.driver [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.716 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.746 187082 INFO nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Took 7.40 seconds to spawn the instance on the hypervisor.
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.747 187082 DEBUG nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.801 187082 INFO nova.compute.manager [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Took 7.92 seconds to build instance.
Nov 24 13:41:44 compute-1 nova_compute[187078]: 2025-11-24 13:41:44.813 187082 DEBUG oslo_concurrency.lockutils [None req-5224a9a8-4137-4523-9ce0-8bca7c58dd2c 6c2c4276f4184eee9cbbb3ab03c0b339 c88c7429e05d451c84ded4ece852815d - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:45 compute-1 podman[217974]: 2025-11-24 13:41:45.512740195 +0000 UTC m=+0.052136333 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:41:46 compute-1 nova_compute[187078]: 2025-11-24 13:41:46.726 187082 DEBUG nova.compute.manager [req-50699f49-9d43-448f-b350-a7c575bed428 req-f5b18833-c6c7-4a00-a81b-b5ac7023554a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:41:46 compute-1 nova_compute[187078]: 2025-11-24 13:41:46.727 187082 DEBUG oslo_concurrency.lockutils [req-50699f49-9d43-448f-b350-a7c575bed428 req-f5b18833-c6c7-4a00-a81b-b5ac7023554a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:41:46 compute-1 nova_compute[187078]: 2025-11-24 13:41:46.727 187082 DEBUG oslo_concurrency.lockutils [req-50699f49-9d43-448f-b350-a7c575bed428 req-f5b18833-c6c7-4a00-a81b-b5ac7023554a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:41:46 compute-1 nova_compute[187078]: 2025-11-24 13:41:46.727 187082 DEBUG oslo_concurrency.lockutils [req-50699f49-9d43-448f-b350-a7c575bed428 req-f5b18833-c6c7-4a00-a81b-b5ac7023554a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:41:46 compute-1 nova_compute[187078]: 2025-11-24 13:41:46.727 187082 DEBUG nova.compute.manager [req-50699f49-9d43-448f-b350-a7c575bed428 req-f5b18833-c6c7-4a00-a81b-b5ac7023554a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:41:46 compute-1 nova_compute[187078]: 2025-11-24 13:41:46.728 187082 WARNING nova.compute.manager [req-50699f49-9d43-448f-b350-a7c575bed428 req-f5b18833-c6c7-4a00-a81b-b5ac7023554a 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received unexpected event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with vm_state active and task_state None.
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: ERROR   13:41:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: ERROR   13:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: ERROR   13:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:41:49 compute-1 nova_compute[187078]: 2025-11-24 13:41:49.462 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: ERROR   13:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: ERROR   13:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:41:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:41:49 compute-1 nova_compute[187078]: 2025-11-24 13:41:49.615 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:54 compute-1 nova_compute[187078]: 2025-11-24 13:41:54.464 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:54 compute-1 nova_compute[187078]: 2025-11-24 13:41:54.617 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:56 compute-1 sshd-session[217996]: Invalid user ubuntu from 61.240.213.113 port 50106
Nov 24 13:41:56 compute-1 sshd-session[217996]: Received disconnect from 61.240.213.113 port 50106:11:  [preauth]
Nov 24 13:41:56 compute-1 sshd-session[217996]: Disconnected from invalid user ubuntu 61.240.213.113 port 50106 [preauth]
Nov 24 13:41:58 compute-1 ovn_controller[95368]: 2025-11-24T13:41:58Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:fe:d4 10.100.0.14
Nov 24 13:41:58 compute-1 ovn_controller[95368]: 2025-11-24T13:41:58Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:fe:d4 10.100.0.14
Nov 24 13:41:59 compute-1 nova_compute[187078]: 2025-11-24 13:41:59.464 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:41:59 compute-1 nova_compute[187078]: 2025-11-24 13:41:59.619 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:00 compute-1 podman[218014]: 2025-11-24 13:42:00.503657936 +0000 UTC m=+0.056399358 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:42:00 compute-1 podman[218015]: 2025-11-24 13:42:00.508313641 +0000 UTC m=+0.052856882 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 24 13:42:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:04.170 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:42:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:04.170 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:42:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:04.171 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:42:04 compute-1 nova_compute[187078]: 2025-11-24 13:42:04.466 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:04 compute-1 nova_compute[187078]: 2025-11-24 13:42:04.621 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:05 compute-1 podman[197429]: time="2025-11-24T13:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:42:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:42:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 24 13:42:06 compute-1 podman[218058]: 2025-11-24 13:42:06.506586886 +0000 UTC m=+0.056536312 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 24 13:42:06 compute-1 sshd-session[218056]: Invalid user django from 68.183.82.237 port 60498
Nov 24 13:42:06 compute-1 podman[218059]: 2025-11-24 13:42:06.527688744 +0000 UTC m=+0.075542443 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:42:06 compute-1 sshd-session[218056]: Received disconnect from 68.183.82.237 port 60498:11: Bye Bye [preauth]
Nov 24 13:42:06 compute-1 sshd-session[218056]: Disconnected from invalid user django 68.183.82.237 port 60498 [preauth]
Nov 24 13:42:09 compute-1 nova_compute[187078]: 2025-11-24 13:42:09.468 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:09 compute-1 nova_compute[187078]: 2025-11-24 13:42:09.623 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:10 compute-1 ovn_controller[95368]: 2025-11-24T13:42:10Z|00216|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 24 13:42:14 compute-1 nova_compute[187078]: 2025-11-24 13:42:14.470 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:14 compute-1 nova_compute[187078]: 2025-11-24 13:42:14.625 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:16 compute-1 podman[218106]: 2025-11-24 13:42:16.503686208 +0000 UTC m=+0.052527224 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: ERROR   13:42:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: ERROR   13:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: ERROR   13:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: ERROR   13:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: ERROR   13:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:42:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:42:19 compute-1 nova_compute[187078]: 2025-11-24 13:42:19.471 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:19 compute-1 nova_compute[187078]: 2025-11-24 13:42:19.627 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:20 compute-1 nova_compute[187078]: 2025-11-24 13:42:20.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:21 compute-1 nova_compute[187078]: 2025-11-24 13:42:21.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:21 compute-1 sshd-session[218104]: Connection closed by 45.78.194.40 port 34010 [preauth]
Nov 24 13:42:24 compute-1 nova_compute[187078]: 2025-11-24 13:42:24.474 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:24 compute-1 nova_compute[187078]: 2025-11-24 13:42:24.629 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:25 compute-1 nova_compute[187078]: 2025-11-24 13:42:25.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:25 compute-1 nova_compute[187078]: 2025-11-24 13:42:25.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:27 compute-1 nova_compute[187078]: 2025-11-24 13:42:27.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.694 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.695 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.695 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.695 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.776 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.863 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.864 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:42:28 compute-1 nova_compute[187078]: 2025-11-24 13:42:28.941 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.105 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.106 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5721MB free_disk=73.42664337158203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.107 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.107 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.166 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 1c31b98e-6492-42f3-9dff-13cbc23050de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.166 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.166 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.205 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.216 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.232 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.232 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.517 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:29 compute-1 nova_compute[187078]: 2025-11-24 13:42:29.630 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.232 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.232 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.232 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.521 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.521 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.522 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:42:30 compute-1 nova_compute[187078]: 2025-11-24 13:42:30.522 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1c31b98e-6492-42f3-9dff-13cbc23050de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:42:31 compute-1 podman[218135]: 2025-11-24 13:42:31.515264212 +0000 UTC m=+0.061243867 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 24 13:42:31 compute-1 podman[218134]: 2025-11-24 13:42:31.515816128 +0000 UTC m=+0.061431683 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:42:31 compute-1 nova_compute[187078]: 2025-11-24 13:42:31.584 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updating instance_info_cache with network_info: [{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:42:31 compute-1 nova_compute[187078]: 2025-11-24 13:42:31.607 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:42:31 compute-1 nova_compute[187078]: 2025-11-24 13:42:31.608 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:42:34 compute-1 nova_compute[187078]: 2025-11-24 13:42:34.517 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:34 compute-1 nova_compute[187078]: 2025-11-24 13:42:34.631 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:35 compute-1 podman[197429]: time="2025-11-24T13:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:42:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:42:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 24 13:42:37 compute-1 nova_compute[187078]: 2025-11-24 13:42:37.036 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:37 compute-1 nova_compute[187078]: 2025-11-24 13:42:37.037 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:42:37 compute-1 podman[218178]: 2025-11-24 13:42:37.125658115 +0000 UTC m=+0.064698631 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 13:42:37 compute-1 podman[218179]: 2025-11-24 13:42:37.176552793 +0000 UTC m=+0.101898511 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:42:39 compute-1 nova_compute[187078]: 2025-11-24 13:42:39.582 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:39 compute-1 nova_compute[187078]: 2025-11-24 13:42:39.633 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:44 compute-1 nova_compute[187078]: 2025-11-24 13:42:44.381 187082 DEBUG nova.compute.manager [None req-7051aa5a-81ac-4ac1-bfa4-ef2bb753a993 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 24 13:42:44 compute-1 nova_compute[187078]: 2025-11-24 13:42:44.445 187082 DEBUG nova.compute.provider_tree [None req-7051aa5a-81ac-4ac1-bfa4-ef2bb753a993 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 42 to 46 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:42:44 compute-1 nova_compute[187078]: 2025-11-24 13:42:44.586 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:44 compute-1 nova_compute[187078]: 2025-11-24 13:42:44.635 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:44 compute-1 sshd-session[218225]: Invalid user sahil from 175.100.24.139 port 50502
Nov 24 13:42:44 compute-1 sshd-session[218225]: Received disconnect from 175.100.24.139 port 50502:11: Bye Bye [preauth]
Nov 24 13:42:44 compute-1 sshd-session[218225]: Disconnected from invalid user sahil 175.100.24.139 port 50502 [preauth]
Nov 24 13:42:47 compute-1 podman[218228]: 2025-11-24 13:42:47.522884281 +0000 UTC m=+0.074745192 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public)
Nov 24 13:42:48 compute-1 nova_compute[187078]: 2025-11-24 13:42:48.791 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Check if temp file /var/lib/nova/instances/tmpqgl0miin exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 24 13:42:48 compute-1 nova_compute[187078]: 2025-11-24 13:42:48.792 187082 DEBUG nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqgl0miin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c31b98e-6492-42f3-9dff-13cbc23050de',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: ERROR   13:42:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: ERROR   13:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: ERROR   13:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: ERROR   13:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: ERROR   13:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:42:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:42:49 compute-1 nova_compute[187078]: 2025-11-24 13:42:49.587 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:49 compute-1 nova_compute[187078]: 2025-11-24 13:42:49.636 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:49 compute-1 nova_compute[187078]: 2025-11-24 13:42:49.977 187082 DEBUG oslo_concurrency.processutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:42:50 compute-1 nova_compute[187078]: 2025-11-24 13:42:50.031 187082 DEBUG oslo_concurrency.processutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:42:50 compute-1 nova_compute[187078]: 2025-11-24 13:42:50.033 187082 DEBUG oslo_concurrency.processutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:42:50 compute-1 nova_compute[187078]: 2025-11-24 13:42:50.115 187082 DEBUG oslo_concurrency.processutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:42:54 compute-1 nova_compute[187078]: 2025-11-24 13:42:54.588 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:54 compute-1 nova_compute[187078]: 2025-11-24 13:42:54.638 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:55 compute-1 sshd-session[218255]: Accepted publickey for nova from 192.168.122.100 port 45824 ssh2: ECDSA SHA256:fPG+pnl3OphvY+8QA0gAR+psMyuz7dlZIDJLz5bJUDI
Nov 24 13:42:55 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Nov 24 13:42:55 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 24 13:42:55 compute-1 systemd-logind[815]: New session 47 of user nova.
Nov 24 13:42:55 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 24 13:42:55 compute-1 systemd[1]: Starting User Manager for UID 42436...
Nov 24 13:42:55 compute-1 systemd[218259]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:42:55 compute-1 systemd[218259]: Queued start job for default target Main User Target.
Nov 24 13:42:55 compute-1 systemd[218259]: Created slice User Application Slice.
Nov 24 13:42:55 compute-1 systemd[218259]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:42:55 compute-1 systemd[218259]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:42:55 compute-1 systemd[218259]: Reached target Paths.
Nov 24 13:42:55 compute-1 systemd[218259]: Reached target Timers.
Nov 24 13:42:55 compute-1 systemd[218259]: Starting D-Bus User Message Bus Socket...
Nov 24 13:42:55 compute-1 systemd[218259]: Starting Create User's Volatile Files and Directories...
Nov 24 13:42:55 compute-1 systemd[218259]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:42:55 compute-1 systemd[218259]: Reached target Sockets.
Nov 24 13:42:55 compute-1 systemd[218259]: Finished Create User's Volatile Files and Directories.
Nov 24 13:42:55 compute-1 systemd[218259]: Reached target Basic System.
Nov 24 13:42:55 compute-1 systemd[218259]: Reached target Main User Target.
Nov 24 13:42:55 compute-1 systemd[218259]: Startup finished in 149ms.
Nov 24 13:42:55 compute-1 systemd[1]: Started User Manager for UID 42436.
Nov 24 13:42:55 compute-1 systemd[1]: Started Session 47 of User nova.
Nov 24 13:42:55 compute-1 sshd-session[218255]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 24 13:42:55 compute-1 sshd-session[218274]: Received disconnect from 192.168.122.100 port 45824:11: disconnected by user
Nov 24 13:42:55 compute-1 sshd-session[218274]: Disconnected from user nova 192.168.122.100 port 45824
Nov 24 13:42:55 compute-1 sshd-session[218255]: pam_unix(sshd:session): session closed for user nova
Nov 24 13:42:55 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Nov 24 13:42:55 compute-1 systemd-logind[815]: Session 47 logged out. Waiting for processes to exit.
Nov 24 13:42:55 compute-1 systemd-logind[815]: Removed session 47.
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.856 187082 DEBUG nova.compute.manager [req-47122e6a-f6ce-45ed-bb6c-cfdecee06d8d req-d4526523-6468-4ca8-8d12-9b355670b139 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.857 187082 DEBUG oslo_concurrency.lockutils [req-47122e6a-f6ce-45ed-bb6c-cfdecee06d8d req-d4526523-6468-4ca8-8d12-9b355670b139 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.857 187082 DEBUG oslo_concurrency.lockutils [req-47122e6a-f6ce-45ed-bb6c-cfdecee06d8d req-d4526523-6468-4ca8-8d12-9b355670b139 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.857 187082 DEBUG oslo_concurrency.lockutils [req-47122e6a-f6ce-45ed-bb6c-cfdecee06d8d req-d4526523-6468-4ca8-8d12-9b355670b139 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.858 187082 DEBUG nova.compute.manager [req-47122e6a-f6ce-45ed-bb6c-cfdecee06d8d req-d4526523-6468-4ca8-8d12-9b355670b139 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.858 187082 DEBUG nova.compute.manager [req-47122e6a-f6ce-45ed-bb6c-cfdecee06d8d req-d4526523-6468-4ca8-8d12-9b355670b139 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:42:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:56.869 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:42:56 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:56.870 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:42:56 compute-1 nova_compute[187078]: 2025-11-24 13:42:56.888 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.636 187082 INFO nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Took 7.52 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.637 187082 DEBUG nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.659 187082 DEBUG nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqgl0miin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c31b98e-6492-42f3-9dff-13cbc23050de',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(6bf27b09-9c1c-497a-a2c0-89429426c073),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.678 187082 DEBUG nova.objects.instance [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid 1c31b98e-6492-42f3-9dff-13cbc23050de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.678 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.680 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.680 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.694 187082 DEBUG nova.virt.libvirt.vif [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:41:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1794161449',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1794161449',id=26,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:41:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c88c7429e05d451c84ded4ece852815d',ramdisk_id='',reservation_id='r-t930adz6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:41:44Z,user_data=None,user_id='6c2c4276f4184eee9cbbb3ab03c0b339',uuid=1c31b98e-6492-42f3-9dff-13cbc23050de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.694 187082 DEBUG nova.network.os_vif_util [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.695 187082 DEBUG nova.network.os_vif_util [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.696 187082 DEBUG nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updating guest XML with vif config: <interface type="ethernet">
Nov 24 13:42:57 compute-1 nova_compute[187078]:   <mac address="fa:16:3e:5b:fe:d4"/>
Nov 24 13:42:57 compute-1 nova_compute[187078]:   <model type="virtio"/>
Nov 24 13:42:57 compute-1 nova_compute[187078]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:42:57 compute-1 nova_compute[187078]:   <mtu size="1442"/>
Nov 24 13:42:57 compute-1 nova_compute[187078]:   <target dev="tap6e12565f-48"/>
Nov 24 13:42:57 compute-1 nova_compute[187078]: </interface>
Nov 24 13:42:57 compute-1 nova_compute[187078]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 24 13:42:57 compute-1 nova_compute[187078]: 2025-11-24 13:42:57.696 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.182 187082 DEBUG nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.183 187082 INFO nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.266 187082 INFO nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.769 187082 DEBUG nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.770 187082 DEBUG nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.940 187082 DEBUG nova.compute.manager [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.940 187082 DEBUG oslo_concurrency.lockutils [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.941 187082 DEBUG oslo_concurrency.lockutils [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.941 187082 DEBUG oslo_concurrency.lockutils [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.942 187082 DEBUG nova.compute.manager [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.942 187082 WARNING nova.compute.manager [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received unexpected event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with vm_state active and task_state migrating.
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.942 187082 DEBUG nova.compute.manager [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-changed-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.943 187082 DEBUG nova.compute.manager [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Refreshing instance network info cache due to event network-changed-6e12565f-48b1-4286-a5de-b083a7a7691d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.944 187082 DEBUG oslo_concurrency.lockutils [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.944 187082 DEBUG oslo_concurrency.lockutils [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:42:58 compute-1 nova_compute[187078]: 2025-11-24 13:42:58.944 187082 DEBUG nova.network.neutron [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Refreshing network info cache for port 6e12565f-48b1-4286-a5de-b083a7a7691d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.230 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991779.2299128, 1c31b98e-6492-42f3-9dff-13cbc23050de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.231 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] VM Paused (Lifecycle Event)
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.250 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.255 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.272 187082 DEBUG nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.273 187082 DEBUG nova.virt.libvirt.migration [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.274 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 24 13:42:59 compute-1 kernel: tap6e12565f-48 (unregistering): left promiscuous mode
Nov 24 13:42:59 compute-1 NetworkManager[55527]: <info>  [1763991779.3899] device (tap6e12565f-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:42:59 compute-1 ovn_controller[95368]: 2025-11-24T13:42:59Z|00217|binding|INFO|Releasing lport 6e12565f-48b1-4286-a5de-b083a7a7691d from this chassis (sb_readonly=0)
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.445 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 ovn_controller[95368]: 2025-11-24T13:42:59Z|00218|binding|INFO|Setting lport 6e12565f-48b1-4286-a5de-b083a7a7691d down in Southbound
Nov 24 13:42:59 compute-1 ovn_controller[95368]: 2025-11-24T13:42:59Z|00219|binding|INFO|Removing iface tap6e12565f-48 ovn-installed in OVS
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.449 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.454 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:fe:d4 10.100.0.14'], port_security=['fa:16:3e:5b:fe:d4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '9f02b040-31e6-4504-b049-75d1186dcdf1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1c31b98e-6492-42f3-9dff-13cbc23050de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c88c7429e05d451c84ded4ece852815d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'fa6db478-7702-44a4-bd64-d0b43349d3b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50d784e-9814-4b17-8c71-48498f64d246, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=6e12565f-48b1-4286-a5de-b083a7a7691d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.457 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 6e12565f-48b1-4286-a5de-b083a7a7691d in datapath 3c401877-c5ee-45c7-b1e0-8a6c54929af1 unbound from our chassis
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.460 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c401877-c5ee-45c7-b1e0-8a6c54929af1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.462 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2e75e2a4-a2bc-42fb-9f2c-e1a7b3eff61d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.463 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1 namespace which is not needed anymore
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.473 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 24 13:42:59 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Consumed 16.145s CPU time.
Nov 24 13:42:59 compute-1 systemd-machined[153355]: Machine qemu-18-instance-0000001a terminated.
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.591 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [NOTICE]   (217961) : haproxy version is 2.8.14-c23fe91
Nov 24 13:42:59 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [NOTICE]   (217961) : path to executable is /usr/sbin/haproxy
Nov 24 13:42:59 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [WARNING]  (217961) : Exiting Master process...
Nov 24 13:42:59 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [WARNING]  (217961) : Exiting Master process...
Nov 24 13:42:59 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [ALERT]    (217961) : Current worker (217963) exited with code 143 (Terminated)
Nov 24 13:42:59 compute-1 neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1[217957]: [WARNING]  (217961) : All workers exited. Exiting... (0)
Nov 24 13:42:59 compute-1 systemd[1]: libpod-f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8.scope: Deactivated successfully.
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.633 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.633 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.634 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 24 13:42:59 compute-1 podman[218319]: 2025-11-24 13:42:59.634996588 +0000 UTC m=+0.066736755 container died f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.640 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8-userdata-shm.mount: Deactivated successfully.
Nov 24 13:42:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-8e7da0b2678acb9c1192aae54868f7bb0480af3568bab1d6df85096f92d7e11d-merged.mount: Deactivated successfully.
Nov 24 13:42:59 compute-1 podman[218319]: 2025-11-24 13:42:59.678506065 +0000 UTC m=+0.110246242 container cleanup f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:42:59 compute-1 systemd[1]: libpod-conmon-f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8.scope: Deactivated successfully.
Nov 24 13:42:59 compute-1 podman[218369]: 2025-11-24 13:42:59.750480501 +0000 UTC m=+0.046996402 container remove f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.757 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[7c615079-a0b8-47ff-839c-eeb88898ca3b]: (4, ('Mon Nov 24 01:42:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1 (f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8)\nf9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8\nMon Nov 24 01:42:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1 (f9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8)\nf9ae28ad5de6a0b6c5cc41f7eb438a0cc3660355eb12bbea2bda3ad5a53690c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.761 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5c9b74f3-8c0c-4de8-b278-e132cd65ff98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.762 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c401877-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.765 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 kernel: tap3c401877-c0: left promiscuous mode
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.776 187082 DEBUG nova.virt.libvirt.guest [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '1c31b98e-6492-42f3-9dff-13cbc23050de' (instance-0000001a) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.776 187082 INFO nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migration operation has completed
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.777 187082 INFO nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] _post_live_migration() is started..
Nov 24 13:42:59 compute-1 nova_compute[187078]: 2025-11-24 13:42:59.781 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.784 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[135387c7-016a-4732-93b7-d197cc7ed92d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.802 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0362243e-954d-4155-8c83-cd6aebc422fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.803 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f744d2ac-28fb-4336-818d-f0eb583c9e2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.824 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[99ea5e04-f83d-4624-b7c0-5619ece9c058]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471920, 'reachable_time': 35499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218388, 'error': None, 'target': 'ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:42:59 compute-1 systemd[1]: run-netns-ovnmeta\x2d3c401877\x2dc5ee\x2d45c7\x2db1e0\x2d8a6c54929af1.mount: Deactivated successfully.
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.828 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c401877-c5ee-45c7-b1e0-8a6c54929af1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:42:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:42:59.829 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[96b63b57-2a16-4e19-a2ce-b58b86bd101c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.012 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.012 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.013 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.014 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.014 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.014 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.015 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.015 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.015 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.016 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.016 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.016 187082 WARNING nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received unexpected event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with vm_state active and task_state migrating.
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.016 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.017 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.017 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.017 187082 DEBUG oslo_concurrency.lockutils [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.017 187082 DEBUG nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.018 187082 WARNING nova.compute.manager [req-c31acca7-8249-4ecd-a8d5-26ca6c43455b req-5f4bbca7-141c-4ab6-896d-3d98a9a7a1d4 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received unexpected event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with vm_state active and task_state migrating.
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.672 187082 DEBUG nova.network.neutron [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updated VIF entry in instance network info cache for port 6e12565f-48b1-4286-a5de-b083a7a7691d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.672 187082 DEBUG nova.network.neutron [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Updating instance_info_cache with network_info: [{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.690 187082 DEBUG oslo_concurrency.lockutils [req-c3fd530d-7dd3-47f1-aac6-fc1c45cc5fc8 req-fd2ae110-8a5a-4c44-8888-efcae4cd82d0 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-1c31b98e-6492-42f3-9dff-13cbc23050de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.694 187082 DEBUG nova.network.neutron [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Activated binding for port 6e12565f-48b1-4286-a5de-b083a7a7691d and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.695 187082 DEBUG nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.696 187082 DEBUG nova.virt.libvirt.vif [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:41:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1794161449',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1794161449',id=26,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:41:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c88c7429e05d451c84ded4ece852815d',ramdisk_id='',reservation_id='r-t930adz6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2064657928-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:42:46Z,user_data=None,user_id='6c2c4276f4184eee9cbbb3ab03c0b339',uuid=1c31b98e-6492-42f3-9dff-13cbc23050de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.697 187082 DEBUG nova.network.os_vif_util [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "6e12565f-48b1-4286-a5de-b083a7a7691d", "address": "fa:16:3e:5b:fe:d4", "network": {"id": "3c401877-c5ee-45c7-b1e0-8a6c54929af1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1618634624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c88c7429e05d451c84ded4ece852815d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e12565f-48", "ovs_interfaceid": "6e12565f-48b1-4286-a5de-b083a7a7691d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.698 187082 DEBUG nova.network.os_vif_util [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.699 187082 DEBUG os_vif [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.701 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.702 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e12565f-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.704 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.708 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.710 187082 INFO os_vif [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:fe:d4,bridge_name='br-int',has_traffic_filtering=True,id=6e12565f-48b1-4286-a5de-b083a7a7691d,network=Network(3c401877-c5ee-45c7-b1e0-8a6c54929af1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e12565f-48')
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.710 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.711 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.711 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.711 187082 DEBUG nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.712 187082 INFO nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Deleting instance files /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de_del
Nov 24 13:43:01 compute-1 nova_compute[187078]: 2025-11-24 13:43:01.712 187082 INFO nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Deletion of /var/lib/nova/instances/1c31b98e-6492-42f3-9dff-13cbc23050de_del complete
Nov 24 13:43:02 compute-1 podman[218390]: 2025-11-24 13:43:02.500615075 +0000 UTC m=+0.051572295 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:43:02 compute-1 podman[218391]: 2025-11-24 13:43:02.500729038 +0000 UTC m=+0.049400996 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.091 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.092 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.092 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.092 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.092 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.092 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-unplugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 WARNING nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received unexpected event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with vm_state active and task_state migrating.
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.093 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.094 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.094 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.094 187082 DEBUG oslo_concurrency.lockutils [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.094 187082 DEBUG nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] No waiting events found dispatching network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:43:03 compute-1 nova_compute[187078]: 2025-11-24 13:43:03.094 187082 WARNING nova.compute.manager [req-857e32fe-99b8-468a-a732-626b113fc69f req-d3de40a2-67e1-40bc-b415-bcc38b8cd2ee 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Received unexpected event network-vif-plugged-6e12565f-48b1-4286-a5de-b083a7a7691d for instance with vm_state active and task_state migrating.
Nov 24 13:43:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:04.171 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:04.172 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:04.172 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:04 compute-1 nova_compute[187078]: 2025-11-24 13:43:04.596 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:05 compute-1 podman[197429]: time="2025-11-24T13:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:43:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:43:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Nov 24 13:43:06 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Nov 24 13:43:06 compute-1 systemd[218259]: Activating special unit Exit the Session...
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped target Main User Target.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped target Basic System.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped target Paths.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped target Sockets.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped target Timers.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 13:43:06 compute-1 systemd[218259]: Closed D-Bus User Message Bus Socket.
Nov 24 13:43:06 compute-1 systemd[218259]: Stopped Create User's Volatile Files and Directories.
Nov 24 13:43:06 compute-1 systemd[218259]: Removed slice User Application Slice.
Nov 24 13:43:06 compute-1 systemd[218259]: Reached target Shutdown.
Nov 24 13:43:06 compute-1 systemd[218259]: Finished Exit the Session.
Nov 24 13:43:06 compute-1 systemd[218259]: Reached target Exit the Session.
Nov 24 13:43:06 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Nov 24 13:43:06 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Nov 24 13:43:06 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 24 13:43:06 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 24 13:43:06 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 24 13:43:06 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 24 13:43:06 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.620 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.621 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.622 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "1c31b98e-6492-42f3-9dff-13cbc23050de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.642 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.642 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:06 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.642 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:06 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.643 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.705 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.784 187082 WARNING nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.786 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5876MB free_disk=73.45546340942383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.786 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.786 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.869 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration for instance 1c31b98e-6492-42f3-9dff-13cbc23050de refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 24 13:43:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:06.873 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.894 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.928 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Migration 6bf27b09-9c1c-497a-a2c0-89429426c073 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.929 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.929 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:43:06 compute-1 nova_compute[187078]: 2025-11-24 13:43:06.983 187082 DEBUG nova.compute.provider_tree [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:43:07 compute-1 nova_compute[187078]: 2025-11-24 13:43:07.001 187082 DEBUG nova.scheduler.client.report [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:43:07 compute-1 nova_compute[187078]: 2025-11-24 13:43:07.023 187082 DEBUG nova.compute.resource_tracker [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:43:07 compute-1 nova_compute[187078]: 2025-11-24 13:43:07.024 187082 DEBUG oslo_concurrency.lockutils [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:07 compute-1 nova_compute[187078]: 2025-11-24 13:43:07.030 187082 INFO nova.compute.manager [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 24 13:43:07 compute-1 nova_compute[187078]: 2025-11-24 13:43:07.111 187082 INFO nova.scheduler.client.report [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Deleted allocation for migration 6bf27b09-9c1c-497a-a2c0-89429426c073
Nov 24 13:43:07 compute-1 nova_compute[187078]: 2025-11-24 13:43:07.112 187082 DEBUG nova.virt.libvirt.driver [None req-3d975ed0-7455-4d41-b80b-2af4f9d41ddf 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 24 13:43:07 compute-1 podman[218433]: 2025-11-24 13:43:07.535519235 +0000 UTC m=+0.082575234 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:43:07 compute-1 podman[218434]: 2025-11-24 13:43:07.56273713 +0000 UTC m=+0.104262399 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 24 13:43:09 compute-1 nova_compute[187078]: 2025-11-24 13:43:09.597 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:11 compute-1 nova_compute[187078]: 2025-11-24 13:43:11.708 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:14 compute-1 nova_compute[187078]: 2025-11-24 13:43:14.601 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:14 compute-1 nova_compute[187078]: 2025-11-24 13:43:14.629 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991779.6294644, 1c31b98e-6492-42f3-9dff-13cbc23050de => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:43:14 compute-1 nova_compute[187078]: 2025-11-24 13:43:14.630 187082 INFO nova.compute.manager [-] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] VM Stopped (Lifecycle Event)
Nov 24 13:43:14 compute-1 nova_compute[187078]: 2025-11-24 13:43:14.655 187082 DEBUG nova.compute.manager [None req-8f689d1d-df27-47d4-9cd5-f4d839dda037 - - - - - -] [instance: 1c31b98e-6492-42f3-9dff-13cbc23050de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:43:16 compute-1 nova_compute[187078]: 2025-11-24 13:43:16.709 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:18 compute-1 podman[218478]: 2025-11-24 13:43:18.528625983 +0000 UTC m=+0.068969506 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: ERROR   13:43:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: ERROR   13:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: ERROR   13:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: ERROR   13:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: ERROR   13:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:43:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:43:19 compute-1 nova_compute[187078]: 2025-11-24 13:43:19.603 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:21 compute-1 nova_compute[187078]: 2025-11-24 13:43:21.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:21 compute-1 nova_compute[187078]: 2025-11-24 13:43:21.712 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:23 compute-1 nova_compute[187078]: 2025-11-24 13:43:23.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:24 compute-1 nova_compute[187078]: 2025-11-24 13:43:24.604 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:25 compute-1 nova_compute[187078]: 2025-11-24 13:43:25.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:25 compute-1 nova_compute[187078]: 2025-11-24 13:43:25.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:26 compute-1 nova_compute[187078]: 2025-11-24 13:43:26.752 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:28 compute-1 nova_compute[187078]: 2025-11-24 13:43:28.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.606 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.686 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.686 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.858 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.859 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5885MB free_disk=73.45546340942383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.859 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.860 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.912 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.912 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.944 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.956 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.957 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:43:29 compute-1 nova_compute[187078]: 2025-11-24 13:43:29.957 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:43:30 compute-1 nova_compute[187078]: 2025-11-24 13:43:30.957 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:30 compute-1 nova_compute[187078]: 2025-11-24 13:43:30.958 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:43:31 compute-1 nova_compute[187078]: 2025-11-24 13:43:31.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:31 compute-1 nova_compute[187078]: 2025-11-24 13:43:31.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:43:31 compute-1 nova_compute[187078]: 2025-11-24 13:43:31.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:43:31 compute-1 nova_compute[187078]: 2025-11-24 13:43:31.677 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:43:31 compute-1 nova_compute[187078]: 2025-11-24 13:43:31.754 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:32 compute-1 nova_compute[187078]: 2025-11-24 13:43:32.800 187082 DEBUG nova.compute.manager [None req-241f37c3-c90b-49ba-882e-74fc17dbc105 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ece8f004-1d5b-407f-a713-f9e87706b045 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 24 13:43:32 compute-1 nova_compute[187078]: 2025-11-24 13:43:32.864 187082 DEBUG nova.compute.provider_tree [None req-241f37c3-c90b-49ba-882e-74fc17dbc105 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Updating resource provider ece8f004-1d5b-407f-a713-f9e87706b045 generation from 46 to 49 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 13:43:33 compute-1 podman[218501]: 2025-11-24 13:43:33.534627011 +0000 UTC m=+0.074757301 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:43:33 compute-1 podman[218502]: 2025-11-24 13:43:33.546741149 +0000 UTC m=+0.090248041 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 24 13:43:33 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:33.999 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:43:34 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:34.000 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:43:34 compute-1 nova_compute[187078]: 2025-11-24 13:43:34.053 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:34 compute-1 nova_compute[187078]: 2025-11-24 13:43:34.608 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:35 compute-1 podman[197429]: time="2025-11-24T13:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:43:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:43:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 24 13:43:36 compute-1 nova_compute[187078]: 2025-11-24 13:43:36.670 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:43:36 compute-1 nova_compute[187078]: 2025-11-24 13:43:36.756 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:37 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:43:37.002 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:43:37 compute-1 nova_compute[187078]: 2025-11-24 13:43:37.067 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:38 compute-1 podman[218542]: 2025-11-24 13:43:38.525203292 +0000 UTC m=+0.067627189 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 13:43:38 compute-1 podman[218543]: 2025-11-24 13:43:38.537286529 +0000 UTC m=+0.081205897 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller)
Nov 24 13:43:39 compute-1 nova_compute[187078]: 2025-11-24 13:43:39.609 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:41 compute-1 nova_compute[187078]: 2025-11-24 13:43:41.795 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:44 compute-1 nova_compute[187078]: 2025-11-24 13:43:44.612 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:46 compute-1 nova_compute[187078]: 2025-11-24 13:43:46.797 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:47 compute-1 sshd-session[218589]: Invalid user sol from 45.148.10.240 port 36294
Nov 24 13:43:47 compute-1 sshd-session[218589]: Connection closed by invalid user sol 45.148.10.240 port 36294 [preauth]
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: ERROR   13:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: ERROR   13:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: ERROR   13:43:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: ERROR   13:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: ERROR   13:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:43:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:43:49 compute-1 podman[218591]: 2025-11-24 13:43:49.539861433 +0000 UTC m=+0.082962534 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Nov 24 13:43:49 compute-1 nova_compute[187078]: 2025-11-24 13:43:49.613 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:51 compute-1 nova_compute[187078]: 2025-11-24 13:43:51.799 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:54 compute-1 nova_compute[187078]: 2025-11-24 13:43:54.615 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:56 compute-1 nova_compute[187078]: 2025-11-24 13:43:56.849 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:43:59 compute-1 nova_compute[187078]: 2025-11-24 13:43:59.617 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:01 compute-1 nova_compute[187078]: 2025-11-24 13:44:01.851 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:04.172 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:04.173 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:04.173 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:04 compute-1 podman[218613]: 2025-11-24 13:44:04.511621897 +0000 UTC m=+0.052017528 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:44:04 compute-1 podman[218614]: 2025-11-24 13:44:04.521569346 +0000 UTC m=+0.055932913 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 24 13:44:04 compute-1 nova_compute[187078]: 2025-11-24 13:44:04.617 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:05 compute-1 podman[197429]: time="2025-11-24T13:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:44:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:44:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:44:06 compute-1 nova_compute[187078]: 2025-11-24 13:44:06.854 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:08 compute-1 nova_compute[187078]: 2025-11-24 13:44:08.907 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:08 compute-1 nova_compute[187078]: 2025-11-24 13:44:08.908 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:08 compute-1 nova_compute[187078]: 2025-11-24 13:44:08.921 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 13:44:08 compute-1 nova_compute[187078]: 2025-11-24 13:44:08.989 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:08 compute-1 nova_compute[187078]: 2025-11-24 13:44:08.990 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.000 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.000 187082 INFO nova.compute.claims [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Claim successful on node compute-1.ctlplane.example.com
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.083 187082 DEBUG nova.compute.provider_tree [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.094 187082 DEBUG nova.scheduler.client.report [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.117 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.118 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.164 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.164 187082 DEBUG nova.network.neutron [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.192 187082 INFO nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.222 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.322 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.323 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.324 187082 INFO nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Creating image(s)
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.324 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "/var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.324 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "/var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.325 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "/var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.336 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.391 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.392 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.393 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.405 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.460 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.461 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.501 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.502 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.502 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:09 compute-1 podman[218659]: 2025-11-24 13:44:09.563028203 +0000 UTC m=+0.097777145 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS)
Nov 24 13:44:09 compute-1 podman[218660]: 2025-11-24 13:44:09.566227269 +0000 UTC m=+0.097113975 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.569 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.570 187082 DEBUG nova.virt.disk.api [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Checking if we can resize image /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.571 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.618 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.623 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.624 187082 DEBUG nova.virt.disk.api [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Cannot resize image /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.624 187082 DEBUG nova.objects.instance [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lazy-loading 'migration_context' on Instance uuid 0c1bcc72-9195-4f95-9651-6398445db523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.636 187082 DEBUG nova.policy [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'edfedf0b61034f078802fdb1dc050c75', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2784a47a2a14067b899db14fb90fc19', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.639 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.639 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Ensure instance console log exists: /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.640 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.640 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:09 compute-1 nova_compute[187078]: 2025-11-24 13:44:09.640 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:10 compute-1 nova_compute[187078]: 2025-11-24 13:44:10.926 187082 DEBUG nova.network.neutron [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Successfully created port: 37bcaf75-ab39-43e9-8516-af26ec1703a2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 13:44:11 compute-1 nova_compute[187078]: 2025-11-24 13:44:11.906 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:12 compute-1 ovn_controller[95368]: 2025-11-24T13:44:12Z|00220|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.664 187082 DEBUG nova.network.neutron [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Successfully updated port: 37bcaf75-ab39-43e9-8516-af26ec1703a2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.679 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.680 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquired lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.680 187082 DEBUG nova.network.neutron [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.801 187082 DEBUG nova.compute.manager [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-changed-37bcaf75-ab39-43e9-8516-af26ec1703a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.801 187082 DEBUG nova.compute.manager [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Refreshing instance network info cache due to event network-changed-37bcaf75-ab39-43e9-8516-af26ec1703a2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.801 187082 DEBUG oslo_concurrency.lockutils [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:44:12 compute-1 nova_compute[187078]: 2025-11-24 13:44:12.840 187082 DEBUG nova.network.neutron [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.038 187082 DEBUG nova.network.neutron [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Updating instance_info_cache with network_info: [{"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.053 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Releasing lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.053 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Instance network_info: |[{"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.054 187082 DEBUG oslo_concurrency.lockutils [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.054 187082 DEBUG nova.network.neutron [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Refreshing network info cache for port 37bcaf75-ab39-43e9-8516-af26ec1703a2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.056 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Start _get_guest_xml network_info=[{"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'image_id': '1d4afc77-cb95-49a2-9165-f8ceca2998fc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.061 187082 WARNING nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.068 187082 DEBUG nova.virt.libvirt.host [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.068 187082 DEBUG nova.virt.libvirt.host [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.071 187082 DEBUG nova.virt.libvirt.host [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.072 187082 DEBUG nova.virt.libvirt.host [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.074 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.075 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T13:14:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9fb1ccae-4ba6-4040-a754-0b156b72dc25',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T13:14:39Z,direct_url=<?>,disk_format='qcow2',id=1d4afc77-cb95-49a2-9165-f8ceca2998fc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6c72cc3d41144d349210cde7f8024bbf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T13:14:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.076 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.076 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.077 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.077 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.078 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.078 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.079 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.079 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.080 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.080 187082 DEBUG nova.virt.hardware [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.088 187082 DEBUG nova.virt.libvirt.vif [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:44:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1704049702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1704049702',id=27,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2784a47a2a14067b899db14fb90fc19',ramdisk_id='',reservation_id='r-993a4j0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:44:09Z,user_data=None,user_id='edfedf0b61034f078802fdb1dc050c75',uuid=0c1bcc72-9195-4f95-9651-6398445db523,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.089 187082 DEBUG nova.network.os_vif_util [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converting VIF {"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.090 187082 DEBUG nova.network.os_vif_util [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.092 187082 DEBUG nova.objects.instance [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0c1bcc72-9195-4f95-9651-6398445db523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.108 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] End _get_guest_xml xml=<domain type="kvm">
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <uuid>0c1bcc72-9195-4f95-9651-6398445db523</uuid>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <name>instance-0000001b</name>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <memory>131072</memory>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <vcpu>1</vcpu>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <metadata>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-1704049702</nova:name>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:creationTime>2025-11-24 13:44:14</nova:creationTime>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:flavor name="m1.nano">
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:memory>128</nova:memory>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:disk>1</nova:disk>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:swap>0</nova:swap>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:vcpus>1</nova:vcpus>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       </nova:flavor>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:owner>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:user uuid="edfedf0b61034f078802fdb1dc050c75">tempest-TestExecuteWorkloadBalancingStrategy-1463119205-project-member</nova:user>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:project uuid="c2784a47a2a14067b899db14fb90fc19">tempest-TestExecuteWorkloadBalancingStrategy-1463119205</nova:project>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       </nova:owner>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:root type="image" uuid="1d4afc77-cb95-49a2-9165-f8ceca2998fc"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <nova:ports>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         <nova:port uuid="37bcaf75-ab39-43e9-8516-af26ec1703a2">
Nov 24 13:44:14 compute-1 nova_compute[187078]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:         </nova:port>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       </nova:ports>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </nova:instance>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </metadata>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <sysinfo type="smbios">
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <system>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <entry name="manufacturer">RDO</entry>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <entry name="product">OpenStack Compute</entry>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <entry name="serial">0c1bcc72-9195-4f95-9651-6398445db523</entry>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <entry name="uuid">0c1bcc72-9195-4f95-9651-6398445db523</entry>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <entry name="family">Virtual Machine</entry>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </system>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </sysinfo>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <os>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <boot dev="hd"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <smbios mode="sysinfo"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </os>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <features>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <acpi/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <apic/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <vmcoreinfo/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </features>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <clock offset="utc">
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <timer name="hpet" present="no"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </clock>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <cpu mode="custom" match="exact">
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <model>Nehalem</model>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </cpu>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   <devices>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <disk type="file" device="disk">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <target dev="vda" bus="virtio"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <disk type="file" device="cdrom">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <source file="/var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.config"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <target dev="sda" bus="sata"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </disk>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <interface type="ethernet">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <mac address="fa:16:3e:5a:b9:67"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <mtu size="1442"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <target dev="tap37bcaf75-ab"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </interface>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <serial type="pty">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <log file="/var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/console.log" append="off"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </serial>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <video>
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <model type="virtio"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </video>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <input type="tablet" bus="usb"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <rng model="virtio">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <backend model="random">/dev/urandom</backend>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </rng>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <controller type="usb" index="0"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     <memballoon model="virtio">
Nov 24 13:44:14 compute-1 nova_compute[187078]:       <stats period="10"/>
Nov 24 13:44:14 compute-1 nova_compute[187078]:     </memballoon>
Nov 24 13:44:14 compute-1 nova_compute[187078]:   </devices>
Nov 24 13:44:14 compute-1 nova_compute[187078]: </domain>
Nov 24 13:44:14 compute-1 nova_compute[187078]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.109 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Preparing to wait for external event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.110 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.110 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.110 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.111 187082 DEBUG nova.virt.libvirt.vif [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T13:44:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1704049702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1704049702',id=27,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c2784a47a2a14067b899db14fb90fc19',ramdisk_id='',reservation_id='r-993a4j0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:44:09Z,user_data=None,user_id='edfedf0b61034f078802fdb1dc050c75',uuid=0c1bcc72-9195-4f95-9651-6398445db523,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.111 187082 DEBUG nova.network.os_vif_util [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converting VIF {"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.112 187082 DEBUG nova.network.os_vif_util [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.113 187082 DEBUG os_vif [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.113 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.114 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.114 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.117 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.118 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37bcaf75-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.118 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37bcaf75-ab, col_values=(('external_ids', {'iface-id': '37bcaf75-ab39-43e9-8516-af26ec1703a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:b9:67', 'vm-uuid': '0c1bcc72-9195-4f95-9651-6398445db523'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.120 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:14 compute-1 NetworkManager[55527]: <info>  [1763991854.1220] manager: (tap37bcaf75-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.122 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.129 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.131 187082 INFO os_vif [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab')
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.173 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.173 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.174 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] No VIF found with MAC fa:16:3e:5a:b9:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.174 187082 INFO nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Using config drive
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.620 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.784 187082 INFO nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Creating config drive at /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.config
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.789 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn0w4d8ek execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.915 187082 DEBUG oslo_concurrency.processutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn0w4d8ek" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:14 compute-1 kernel: tap37bcaf75-ab: entered promiscuous mode
Nov 24 13:44:14 compute-1 ovn_controller[95368]: 2025-11-24T13:44:14Z|00221|binding|INFO|Claiming lport 37bcaf75-ab39-43e9-8516-af26ec1703a2 for this chassis.
Nov 24 13:44:14 compute-1 ovn_controller[95368]: 2025-11-24T13:44:14Z|00222|binding|INFO|37bcaf75-ab39-43e9-8516-af26ec1703a2: Claiming fa:16:3e:5a:b9:67 10.100.0.12
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.982 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:14 compute-1 NetworkManager[55527]: <info>  [1763991854.9824] manager: (tap37bcaf75-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Nov 24 13:44:14 compute-1 nova_compute[187078]: 2025-11-24 13:44:14.985 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.001 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:b9:67 10.100.0.12'], port_security=['fa:16:3e:5a:b9:67 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0c1bcc72-9195-4f95-9651-6398445db523', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e2ed81c-d809-4eac-aa1d-550e25261754', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2784a47a2a14067b899db14fb90fc19', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd261cbc3-3314-493c-9f86-e0d57ec0fbbd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e69ba9-1340-4130-a1e7-4a69be27bd25, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=37bcaf75-ab39-43e9-8516-af26ec1703a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.002 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcaf75-ab39-43e9-8516-af26ec1703a2 in datapath 9e2ed81c-d809-4eac-aa1d-550e25261754 bound to our chassis
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.004 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9e2ed81c-d809-4eac-aa1d-550e25261754
Nov 24 13:44:15 compute-1 systemd-udevd[218732]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:44:15 compute-1 systemd-machined[153355]: New machine qemu-19-instance-0000001b.
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.014 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b00bea6e-e997-4c5d-8c28-136854d7998e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.015 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9e2ed81c-d1 in ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.018 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9e2ed81c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.018 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec285a4-b503-4f6e-9602-aace93871357]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.019 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb206f7-2399-408e-b385-609912619b74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 NetworkManager[55527]: <info>  [1763991855.0224] device (tap37bcaf75-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:44:15 compute-1 NetworkManager[55527]: <info>  [1763991855.0232] device (tap37bcaf75-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.033 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[a87dde0d-2f23-4035-8365-4a89bb0bb177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-0000001b.
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.054 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 ovn_controller[95368]: 2025-11-24T13:44:15Z|00223|binding|INFO|Setting lport 37bcaf75-ab39-43e9-8516-af26ec1703a2 ovn-installed in OVS
Nov 24 13:44:15 compute-1 ovn_controller[95368]: 2025-11-24T13:44:15Z|00224|binding|INFO|Setting lport 37bcaf75-ab39-43e9-8516-af26ec1703a2 up in Southbound
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.061 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.064 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a33d2d8b-f95a-4467-b70d-3d7f213ecfb5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.092 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[3f223481-c6fd-424c-a36e-62bf524e79f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.096 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[3991329f-1ff3-4075-9a2a-8735f451ea41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 NetworkManager[55527]: <info>  [1763991855.0979] manager: (tap9e2ed81c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.129 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[82c52927-c37e-405d-9e1b-f92e19321a5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.133 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[02f1d7bc-2f77-47a0-bd4f-e245b9a1e9be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 NetworkManager[55527]: <info>  [1763991855.1531] device (tap9e2ed81c-d0): carrier: link connected
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.156 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[1a8cd987-eda6-45b4-8d39-57de5c0b08d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.172 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[50036536-350d-4ce7-8082-dd9f67f828c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e2ed81c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:dc:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487381, 'reachable_time': 34141, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218765, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.184 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[212040b8-7368-464b-ae63-2670a84f0a59]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:dc96'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487381, 'tstamp': 487381}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218766, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.201 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a17fb9a5-8d7b-40da-82eb-b8f8ac2e0327]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e2ed81c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:dc:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487381, 'reachable_time': 34141, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218767, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.222 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[34060a3d-c756-4ad7-a14c-1f071cd9071d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.277 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[55accd4a-9399-4a05-aaff-c54807374127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.278 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e2ed81c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.279 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.279 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e2ed81c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.281 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 kernel: tap9e2ed81c-d0: entered promiscuous mode
Nov 24 13:44:15 compute-1 NetworkManager[55527]: <info>  [1763991855.2834] manager: (tap9e2ed81c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.284 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9e2ed81c-d0, col_values=(('external_ids', {'iface-id': '94017d14-ef7d-428f-bd1a-453f02753f15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.285 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 ovn_controller[95368]: 2025-11-24T13:44:15Z|00225|binding|INFO|Releasing lport 94017d14-ef7d-428f-bd1a-453f02753f15 from this chassis (sb_readonly=0)
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.286 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.287 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9e2ed81c-d809-4eac-aa1d-550e25261754.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9e2ed81c-d809-4eac-aa1d-550e25261754.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.287 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[2578233f-3c09-4b7e-a77b-122beb8c5385]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.288 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-9e2ed81c-d809-4eac-aa1d-550e25261754
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/9e2ed81c-d809-4eac-aa1d-550e25261754.pid.haproxy
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID 9e2ed81c-d809-4eac-aa1d-550e25261754
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:44:15 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:15.289 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'env', 'PROCESS_TAG=haproxy-9e2ed81c-d809-4eac-aa1d-550e25261754', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9e2ed81c-d809-4eac-aa1d-550e25261754.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.296 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.555 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991855.5545497, 0c1bcc72-9195-4f95-9651-6398445db523 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.555 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] VM Started (Lifecycle Event)
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.577 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.581 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991855.5546823, 0c1bcc72-9195-4f95-9651-6398445db523 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.581 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] VM Paused (Lifecycle Event)
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.599 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.601 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:44:15 compute-1 podman[218806]: 2025-11-24 13:44:15.614609006 +0000 UTC m=+0.042373276 container create e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.626 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:44:15 compute-1 systemd[1]: Started libpod-conmon-e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a.scope.
Nov 24 13:44:15 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:44:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9aa6918e132ee966bfa822bc6d0a6c9610a276017e470a59c50767911d208f31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:44:15 compute-1 podman[218806]: 2025-11-24 13:44:15.594110422 +0000 UTC m=+0.021874712 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:44:15 compute-1 podman[218806]: 2025-11-24 13:44:15.704639281 +0000 UTC m=+0.132403551 container init e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 13:44:15 compute-1 podman[218806]: 2025-11-24 13:44:15.710223582 +0000 UTC m=+0.137987852 container start e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 13:44:15 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [NOTICE]   (218826) : New worker (218828) forked
Nov 24 13:44:15 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [NOTICE]   (218826) : Loading success.
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.741 187082 DEBUG nova.compute.manager [req-17d060fd-f6a0-4c89-8071-ede98ae06137 req-daa0da41-8562-47a0-81d1-ec5d758f3bb2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.742 187082 DEBUG oslo_concurrency.lockutils [req-17d060fd-f6a0-4c89-8071-ede98ae06137 req-daa0da41-8562-47a0-81d1-ec5d758f3bb2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.742 187082 DEBUG oslo_concurrency.lockutils [req-17d060fd-f6a0-4c89-8071-ede98ae06137 req-daa0da41-8562-47a0-81d1-ec5d758f3bb2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.743 187082 DEBUG oslo_concurrency.lockutils [req-17d060fd-f6a0-4c89-8071-ede98ae06137 req-daa0da41-8562-47a0-81d1-ec5d758f3bb2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.743 187082 DEBUG nova.compute.manager [req-17d060fd-f6a0-4c89-8071-ede98ae06137 req-daa0da41-8562-47a0-81d1-ec5d758f3bb2 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Processing event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.744 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.747 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991855.7469146, 0c1bcc72-9195-4f95-9651-6398445db523 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.747 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] VM Resumed (Lifecycle Event)
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.749 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.752 187082 INFO nova.virt.libvirt.driver [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Instance spawned successfully.
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.753 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.768 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.773 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.776 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.776 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.777 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.777 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.777 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.778 187082 DEBUG nova.virt.libvirt.driver [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.800 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.832 187082 INFO nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Took 6.51 seconds to spawn the instance on the hypervisor.
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.833 187082 DEBUG nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.886 187082 INFO nova.compute.manager [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Took 6.93 seconds to build instance.
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.897 187082 DEBUG oslo_concurrency.lockutils [None req-7aeb8b26-cda3-49c7-9985-6340c08d227c edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.933 187082 DEBUG nova.network.neutron [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Updated VIF entry in instance network info cache for port 37bcaf75-ab39-43e9-8516-af26ec1703a2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.933 187082 DEBUG nova.network.neutron [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Updating instance_info_cache with network_info: [{"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:44:15 compute-1 nova_compute[187078]: 2025-11-24 13:44:15.943 187082 DEBUG oslo_concurrency.lockutils [req-4470427f-ab08-4449-8e8f-60cdc5fd579b req-8d9337bf-c01b-44af-af4b-5fd0d8acf454 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:44:17 compute-1 nova_compute[187078]: 2025-11-24 13:44:17.813 187082 DEBUG nova.compute.manager [req-c86a2fa7-6904-4440-a8bd-64ca133af258 req-75a61a6b-a91f-4e51-87b7-1efd997dca5c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:44:17 compute-1 nova_compute[187078]: 2025-11-24 13:44:17.814 187082 DEBUG oslo_concurrency.lockutils [req-c86a2fa7-6904-4440-a8bd-64ca133af258 req-75a61a6b-a91f-4e51-87b7-1efd997dca5c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:17 compute-1 nova_compute[187078]: 2025-11-24 13:44:17.815 187082 DEBUG oslo_concurrency.lockutils [req-c86a2fa7-6904-4440-a8bd-64ca133af258 req-75a61a6b-a91f-4e51-87b7-1efd997dca5c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:17 compute-1 nova_compute[187078]: 2025-11-24 13:44:17.816 187082 DEBUG oslo_concurrency.lockutils [req-c86a2fa7-6904-4440-a8bd-64ca133af258 req-75a61a6b-a91f-4e51-87b7-1efd997dca5c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:17 compute-1 nova_compute[187078]: 2025-11-24 13:44:17.816 187082 DEBUG nova.compute.manager [req-c86a2fa7-6904-4440-a8bd-64ca133af258 req-75a61a6b-a91f-4e51-87b7-1efd997dca5c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] No waiting events found dispatching network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:44:17 compute-1 nova_compute[187078]: 2025-11-24 13:44:17.817 187082 WARNING nova.compute.manager [req-c86a2fa7-6904-4440-a8bd-64ca133af258 req-75a61a6b-a91f-4e51-87b7-1efd997dca5c 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received unexpected event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 for instance with vm_state active and task_state None.
Nov 24 13:44:19 compute-1 nova_compute[187078]: 2025-11-24 13:44:19.121 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:19 compute-1 sshd-session[218837]: Invalid user sol from 193.32.162.146 port 46260
Nov 24 13:44:19 compute-1 sshd-session[218837]: Connection closed by invalid user sol 193.32.162.146 port 46260 [preauth]
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: ERROR   13:44:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: ERROR   13:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: ERROR   13:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: ERROR   13:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: ERROR   13:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:44:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:44:19 compute-1 nova_compute[187078]: 2025-11-24 13:44:19.621 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:20 compute-1 podman[218839]: 2025-11-24 13:44:20.524512096 +0000 UTC m=+0.062189072 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Nov 24 13:44:23 compute-1 nova_compute[187078]: 2025-11-24 13:44:23.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:23 compute-1 nova_compute[187078]: 2025-11-24 13:44:23.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:24 compute-1 nova_compute[187078]: 2025-11-24 13:44:24.124 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:24 compute-1 nova_compute[187078]: 2025-11-24 13:44:24.624 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:26 compute-1 sshd-session[218861]: Invalid user elemental from 175.100.24.139 port 52742
Nov 24 13:44:26 compute-1 nova_compute[187078]: 2025-11-24 13:44:26.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:26 compute-1 nova_compute[187078]: 2025-11-24 13:44:26.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:26 compute-1 sshd-session[218861]: Received disconnect from 175.100.24.139 port 52742:11: Bye Bye [preauth]
Nov 24 13:44:26 compute-1 sshd-session[218861]: Disconnected from invalid user elemental 175.100.24.139 port 52742 [preauth]
Nov 24 13:44:28 compute-1 nova_compute[187078]: 2025-11-24 13:44:28.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:29 compute-1 nova_compute[187078]: 2025-11-24 13:44:29.126 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:29 compute-1 ovn_controller[95368]: 2025-11-24T13:44:29Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5a:b9:67 10.100.0.12
Nov 24 13:44:29 compute-1 ovn_controller[95368]: 2025-11-24T13:44:29Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5a:b9:67 10.100.0.12
Nov 24 13:44:29 compute-1 nova_compute[187078]: 2025-11-24 13:44:29.625 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.693 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.759 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.832 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.833 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:31 compute-1 nova_compute[187078]: 2025-11-24 13:44:31.892 187082 DEBUG oslo_concurrency.processutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.042 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.043 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.4261474609375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.044 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.044 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.108 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Instance 0c1bcc72-9195-4f95-9651-6398445db523 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.108 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.109 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.122 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.138 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.138 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.151 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.168 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.202 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.216 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.232 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:44:32 compute-1 nova_compute[187078]: 2025-11-24 13:44:32.233 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.232 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.232 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.233 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.602 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.602 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquired lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.603 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 13:44:33 compute-1 nova_compute[187078]: 2025-11-24 13:44:33.603 187082 DEBUG nova.objects.instance [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0c1bcc72-9195-4f95-9651-6398445db523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:44:34 compute-1 nova_compute[187078]: 2025-11-24 13:44:34.130 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:34 compute-1 nova_compute[187078]: 2025-11-24 13:44:34.629 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:34 compute-1 nova_compute[187078]: 2025-11-24 13:44:34.767 187082 DEBUG nova.network.neutron [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Updating instance_info_cache with network_info: [{"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:44:34 compute-1 nova_compute[187078]: 2025-11-24 13:44:34.784 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Releasing lock "refresh_cache-0c1bcc72-9195-4f95-9651-6398445db523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:44:34 compute-1 nova_compute[187078]: 2025-11-24 13:44:34.784 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 13:44:35 compute-1 podman[218882]: 2025-11-24 13:44:35.524119744 +0000 UTC m=+0.059947891 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:44:35 compute-1 podman[218883]: 2025-11-24 13:44:35.55536371 +0000 UTC m=+0.087387685 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:44:35 compute-1 nova_compute[187078]: 2025-11-24 13:44:35.564 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Creating tmpfile /var/lib/nova/instances/tmpc_ov3akn to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 24 13:44:35 compute-1 nova_compute[187078]: 2025-11-24 13:44:35.565 187082 DEBUG nova.compute.manager [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc_ov3akn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 24 13:44:35 compute-1 podman[197429]: time="2025-11-24T13:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:44:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 24 13:44:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Nov 24 13:44:37 compute-1 nova_compute[187078]: 2025-11-24 13:44:37.972 187082 DEBUG nova.compute.manager [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc_ov3akn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fc5f8dd8-0047-4086-91fe-4976b3c4eef4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 24 13:44:37 compute-1 nova_compute[187078]: 2025-11-24 13:44:37.987 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "refresh_cache-fc5f8dd8-0047-4086-91fe-4976b3c4eef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:44:37 compute-1 nova_compute[187078]: 2025-11-24 13:44:37.987 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquired lock "refresh_cache-fc5f8dd8-0047-4086-91fe-4976b3c4eef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:44:37 compute-1 nova_compute[187078]: 2025-11-24 13:44:37.987 187082 DEBUG nova.network.neutron [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:44:38 compute-1 nova_compute[187078]: 2025-11-24 13:44:38.211 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:39 compute-1 nova_compute[187078]: 2025-11-24 13:44:39.133 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:39 compute-1 nova_compute[187078]: 2025-11-24 13:44:39.629 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:39 compute-1 nova_compute[187078]: 2025-11-24 13:44:39.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.006 187082 DEBUG nova.network.neutron [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Updating instance_info_cache with network_info: [{"id": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "address": "fa:16:3e:c4:0f:c3", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c2e5c7-ad", "ovs_interfaceid": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.025 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Releasing lock "refresh_cache-fc5f8dd8-0047-4086-91fe-4976b3c4eef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.026 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc_ov3akn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fc5f8dd8-0047-4086-91fe-4976b3c4eef4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.027 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Creating instance directory: /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.027 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Creating disk.info with the contents: {'/var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk': 'qcow2', '/var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.027 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.028 187082 DEBUG nova.objects.instance [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lazy-loading 'trusted_certs' on Instance uuid fc5f8dd8-0047-4086-91fe-4976b3c4eef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.048 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.100 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.101 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.102 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.113 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.165 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.166 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.196 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.197 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.197 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.250 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.251 187082 DEBUG nova.virt.disk.api [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Checking if we can resize image /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.251 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.302 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.303 187082 DEBUG nova.virt.disk.api [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Cannot resize image /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.303 187082 DEBUG nova.objects.instance [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lazy-loading 'migration_context' on Instance uuid fc5f8dd8-0047-4086-91fe-4976b3c4eef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.314 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.334 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.335 187082 DEBUG nova.virt.libvirt.volume.remotefs [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk.config to /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.336 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk.config /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:44:40 compute-1 podman[218945]: 2025-11-24 13:44:40.503635456 +0000 UTC m=+0.052170762 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 24 13:44:40 compute-1 podman[218946]: 2025-11-24 13:44:40.535514777 +0000 UTC m=+0.076968981 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.879 187082 DEBUG oslo_concurrency.processutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4/disk.config /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.880 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.881 187082 DEBUG nova.virt.libvirt.vif [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:44:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-41199338',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-41199338',id=28,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:44:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c2784a47a2a14067b899db14fb90fc19',ramdisk_id='',reservation_id='r-krivof60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:44:29Z,user_data=None,user_id='edfedf0b61034f078802fdb1dc050c75',uuid=fc5f8dd8-0047-4086-91fe-4976b3c4eef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "address": "fa:16:3e:c4:0f:c3", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc5c2e5c7-ad", "ovs_interfaceid": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.881 187082 DEBUG nova.network.os_vif_util [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Converting VIF {"id": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "address": "fa:16:3e:c4:0f:c3", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc5c2e5c7-ad", "ovs_interfaceid": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.882 187082 DEBUG nova.network.os_vif_util [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:0f:c3,bridge_name='br-int',has_traffic_filtering=True,id=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c2e5c7-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.882 187082 DEBUG os_vif [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:0f:c3,bridge_name='br-int',has_traffic_filtering=True,id=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c2e5c7-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.883 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.883 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.884 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.885 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.886 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5c2e5c7-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.886 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5c2e5c7-ad, col_values=(('external_ids', {'iface-id': 'c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:0f:c3', 'vm-uuid': 'fc5f8dd8-0047-4086-91fe-4976b3c4eef4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.887 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:40 compute-1 NetworkManager[55527]: <info>  [1763991880.8883] manager: (tapc5c2e5c7-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.890 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.893 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.893 187082 INFO os_vif [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:0f:c3,bridge_name='br-int',has_traffic_filtering=True,id=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c2e5c7-ad')
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.894 187082 DEBUG nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 24 13:44:40 compute-1 nova_compute[187078]: 2025-11-24 13:44:40.894 187082 DEBUG nova.compute.manager [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc_ov3akn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fc5f8dd8-0047-4086-91fe-4976b3c4eef4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 24 13:44:42 compute-1 nova_compute[187078]: 2025-11-24 13:44:42.175 187082 DEBUG nova.network.neutron [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Port c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 24 13:44:42 compute-1 nova_compute[187078]: 2025-11-24 13:44:42.177 187082 DEBUG nova.compute.manager [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc_ov3akn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fc5f8dd8-0047-4086-91fe-4976b3c4eef4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 24 13:44:42 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 24 13:44:42 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 24 13:44:42 compute-1 kernel: tapc5c2e5c7-ad: entered promiscuous mode
Nov 24 13:44:42 compute-1 NetworkManager[55527]: <info>  [1763991882.4773] manager: (tapc5c2e5c7-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Nov 24 13:44:42 compute-1 ovn_controller[95368]: 2025-11-24T13:44:42Z|00226|binding|INFO|Claiming lport c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 for this additional chassis.
Nov 24 13:44:42 compute-1 ovn_controller[95368]: 2025-11-24T13:44:42Z|00227|binding|INFO|c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872: Claiming fa:16:3e:c4:0f:c3 10.100.0.7
Nov 24 13:44:42 compute-1 nova_compute[187078]: 2025-11-24 13:44:42.477 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:42 compute-1 ovn_controller[95368]: 2025-11-24T13:44:42Z|00228|binding|INFO|Setting lport c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 ovn-installed in OVS
Nov 24 13:44:42 compute-1 nova_compute[187078]: 2025-11-24 13:44:42.492 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:42 compute-1 nova_compute[187078]: 2025-11-24 13:44:42.498 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:42 compute-1 systemd-udevd[219026]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:44:42 compute-1 systemd-machined[153355]: New machine qemu-20-instance-0000001c.
Nov 24 13:44:42 compute-1 NetworkManager[55527]: <info>  [1763991882.5253] device (tapc5c2e5c7-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:44:42 compute-1 NetworkManager[55527]: <info>  [1763991882.5260] device (tapc5c2e5c7-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:44:42 compute-1 systemd[1]: Started Virtual Machine qemu-20-instance-0000001c.
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.043 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991883.0431068, fc5f8dd8-0047-4086-91fe-4976b3c4eef4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.044 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] VM Started (Lifecycle Event)
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.064 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.786 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763991883.78589, fc5f8dd8-0047-4086-91fe-4976b3c4eef4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.787 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] VM Resumed (Lifecycle Event)
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.805 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.809 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:44:43 compute-1 nova_compute[187078]: 2025-11-24 13:44:43.829 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.277 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:44:44 compute-1 nova_compute[187078]: 2025-11-24 13:44:44.278 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.279 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:44:44 compute-1 ovn_controller[95368]: 2025-11-24T13:44:44Z|00229|binding|INFO|Claiming lport c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 for this chassis.
Nov 24 13:44:44 compute-1 ovn_controller[95368]: 2025-11-24T13:44:44Z|00230|binding|INFO|c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872: Claiming fa:16:3e:c4:0f:c3 10.100.0.7
Nov 24 13:44:44 compute-1 ovn_controller[95368]: 2025-11-24T13:44:44Z|00231|binding|INFO|Setting lport c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 up in Southbound
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.601 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:0f:c3 10.100.0.7'], port_security=['fa:16:3e:c4:0f:c3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fc5f8dd8-0047-4086-91fe-4976b3c4eef4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e2ed81c-d809-4eac-aa1d-550e25261754', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2784a47a2a14067b899db14fb90fc19', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'd261cbc3-3314-493c-9f86-e0d57ec0fbbd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e69ba9-1340-4130-a1e7-4a69be27bd25, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.603 104225 INFO neutron.agent.ovn.metadata.agent [-] Port c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 in datapath 9e2ed81c-d809-4eac-aa1d-550e25261754 bound to our chassis
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.606 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9e2ed81c-d809-4eac-aa1d-550e25261754
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.626 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[73daa4d7-70a8-43b6-b184-190c2e48832a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:44 compute-1 nova_compute[187078]: 2025-11-24 13:44:44.631 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.666 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[c88ae9d4-5019-4bd1-aa97-86458a8257be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.673 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[99826756-d3e3-4e18-b5f6-f9401ac81085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.712 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1b7cff-2ae6-4a1c-a808-f93576f6ddb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.741 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[de6b39bb-7940-42c5-9e2c-9a54ea41c053]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e2ed81c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:dc:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487381, 'reachable_time': 34141, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219059, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:44 compute-1 nova_compute[187078]: 2025-11-24 13:44:44.761 187082 INFO nova.compute.manager [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Post operation of migration started
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.766 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[485f7c9b-168d-4848-b64f-c8f305a533c7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9e2ed81c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487390, 'tstamp': 487390}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219060, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9e2ed81c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487393, 'tstamp': 487393}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219060, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.769 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e2ed81c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:44 compute-1 nova_compute[187078]: 2025-11-24 13:44:44.770 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:44 compute-1 nova_compute[187078]: 2025-11-24 13:44:44.771 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.772 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e2ed81c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.772 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.773 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9e2ed81c-d0, col_values=(('external_ids', {'iface-id': '94017d14-ef7d-428f-bd1a-453f02753f15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:44 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:44.774 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.003 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "refresh_cache-fc5f8dd8-0047-4086-91fe-4976b3c4eef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.004 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquired lock "refresh_cache-fc5f8dd8-0047-4086-91fe-4976b3c4eef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.004 187082 DEBUG nova.network.neutron [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:44:45 compute-1 sshd-session[218991]: Invalid user zimbra from 45.78.194.40 port 35742
Nov 24 13:44:45 compute-1 sshd-session[218991]: Received disconnect from 45.78.194.40 port 35742:11: Bye Bye [preauth]
Nov 24 13:44:45 compute-1 sshd-session[218991]: Disconnected from invalid user zimbra 45.78.194.40 port 35742 [preauth]
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.889 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.942 187082 DEBUG nova.network.neutron [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Updating instance_info_cache with network_info: [{"id": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "address": "fa:16:3e:c4:0f:c3", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c2e5c7-ad", "ovs_interfaceid": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.965 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Releasing lock "refresh_cache-fc5f8dd8-0047-4086-91fe-4976b3c4eef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.982 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.982 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.983 187082 DEBUG oslo_concurrency.lockutils [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:45 compute-1 nova_compute[187078]: 2025-11-24 13:44:45.988 187082 INFO nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 24 13:44:45 compute-1 virtqemud[186628]: Domain id=20 name='instance-0000001c' uuid=fc5f8dd8-0047-4086-91fe-4976b3c4eef4 is tainted: custom-monitor
Nov 24 13:44:46 compute-1 nova_compute[187078]: 2025-11-24 13:44:46.998 187082 INFO nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 24 13:44:47 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:47.282 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:48 compute-1 nova_compute[187078]: 2025-11-24 13:44:48.005 187082 INFO nova.virt.libvirt.driver [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 24 13:44:48 compute-1 nova_compute[187078]: 2025-11-24 13:44:48.011 187082 DEBUG nova.compute.manager [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:44:48 compute-1 nova_compute[187078]: 2025-11-24 13:44:48.033 187082 DEBUG nova.objects.instance [None req-4236daf6-8843-41eb-9fae-be3eeeb23c69 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: ERROR   13:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: ERROR   13:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: ERROR   13:44:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: ERROR   13:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: ERROR   13:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:44:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:44:49 compute-1 nova_compute[187078]: 2025-11-24 13:44:49.633 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:50 compute-1 nova_compute[187078]: 2025-11-24 13:44:50.891 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:51 compute-1 podman[219061]: 2025-11-24 13:44:51.578437633 +0000 UTC m=+0.099267885 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:44:54 compute-1 nova_compute[187078]: 2025-11-24 13:44:54.636 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:55 compute-1 nova_compute[187078]: 2025-11-24 13:44:55.895 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.613 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.613 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.614 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.614 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.614 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.615 187082 INFO nova.compute.manager [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Terminating instance
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.616 187082 DEBUG nova.compute.manager [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:44:59 compute-1 kernel: tapc5c2e5c7-ad (unregistering): left promiscuous mode
Nov 24 13:44:59 compute-1 NetworkManager[55527]: <info>  [1763991899.6400] device (tapc5c2e5c7-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.640 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.649 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 ovn_controller[95368]: 2025-11-24T13:44:59Z|00232|binding|INFO|Releasing lport c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 from this chassis (sb_readonly=0)
Nov 24 13:44:59 compute-1 ovn_controller[95368]: 2025-11-24T13:44:59Z|00233|binding|INFO|Setting lport c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 down in Southbound
Nov 24 13:44:59 compute-1 ovn_controller[95368]: 2025-11-24T13:44:59Z|00234|binding|INFO|Removing iface tapc5c2e5c7-ad ovn-installed in OVS
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.652 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.656 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:0f:c3 10.100.0.7'], port_security=['fa:16:3e:c4:0f:c3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fc5f8dd8-0047-4086-91fe-4976b3c4eef4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e2ed81c-d809-4eac-aa1d-550e25261754', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2784a47a2a14067b899db14fb90fc19', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'd261cbc3-3314-493c-9f86-e0d57ec0fbbd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e69ba9-1340-4130-a1e7-4a69be27bd25, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.657 104225 INFO neutron.agent.ovn.metadata.agent [-] Port c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 in datapath 9e2ed81c-d809-4eac-aa1d-550e25261754 unbound from our chassis
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.659 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9e2ed81c-d809-4eac-aa1d-550e25261754
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.664 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.677 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6446d3-e5c8-4dbd-bf8c-f3078c593648]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:59 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Nov 24 13:44:59 compute-1 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001c.scope: Consumed 1.798s CPU time.
Nov 24 13:44:59 compute-1 systemd-machined[153355]: Machine qemu-20-instance-0000001c terminated.
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.703 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[7c928421-ddc7-44f8-9ee8-b5eeac264005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.706 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad97ca2-854b-4614-af47-8fbffa8b6dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.739 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[67d194c9-46d4-4217-a95a-c201269aec7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.758 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[95c217e2-b27e-403d-9f9d-68fa3d588bc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e2ed81c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:dc:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 7, 'rx_bytes': 952, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 7, 'rx_bytes': 952, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487381, 'reachable_time': 34141, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219095, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.784 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c39bb9b4-35c6-4cf7-9ffd-0c5883342595]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9e2ed81c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487390, 'tstamp': 487390}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219096, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9e2ed81c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487393, 'tstamp': 487393}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219096, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.785 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e2ed81c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.785 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.789 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.789 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e2ed81c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.790 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.790 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9e2ed81c-d0, col_values=(('external_ids', {'iface-id': '94017d14-ef7d-428f-bd1a-453f02753f15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:44:59.790 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.830 187082 DEBUG nova.compute.manager [req-674740dd-2e9b-4c8f-9fca-dc605df15ae7 req-04852eb5-fa32-4651-8073-5a12d5bda2ce 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Received event network-vif-unplugged-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.831 187082 DEBUG oslo_concurrency.lockutils [req-674740dd-2e9b-4c8f-9fca-dc605df15ae7 req-04852eb5-fa32-4651-8073-5a12d5bda2ce 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.831 187082 DEBUG oslo_concurrency.lockutils [req-674740dd-2e9b-4c8f-9fca-dc605df15ae7 req-04852eb5-fa32-4651-8073-5a12d5bda2ce 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.831 187082 DEBUG oslo_concurrency.lockutils [req-674740dd-2e9b-4c8f-9fca-dc605df15ae7 req-04852eb5-fa32-4651-8073-5a12d5bda2ce 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.831 187082 DEBUG nova.compute.manager [req-674740dd-2e9b-4c8f-9fca-dc605df15ae7 req-04852eb5-fa32-4651-8073-5a12d5bda2ce 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] No waiting events found dispatching network-vif-unplugged-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.832 187082 DEBUG nova.compute.manager [req-674740dd-2e9b-4c8f-9fca-dc605df15ae7 req-04852eb5-fa32-4651-8073-5a12d5bda2ce 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Received event network-vif-unplugged-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.843 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.847 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.883 187082 INFO nova.virt.libvirt.driver [-] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Instance destroyed successfully.
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.884 187082 DEBUG nova.objects.instance [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lazy-loading 'resources' on Instance uuid fc5f8dd8-0047-4086-91fe-4976b3c4eef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.894 187082 DEBUG nova.virt.libvirt.vif [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-24T13:44:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-41199338',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-41199338',id=28,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:44:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2784a47a2a14067b899db14fb90fc19',ramdisk_id='',reservation_id='r-krivof60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:44:48Z,user_data=None,user_id='edfedf0b61034f078802fdb1dc050c75',uuid=fc5f8dd8-0047-4086-91fe-4976b3c4eef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "address": "fa:16:3e:c4:0f:c3", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c2e5c7-ad", "ovs_interfaceid": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.895 187082 DEBUG nova.network.os_vif_util [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converting VIF {"id": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "address": "fa:16:3e:c4:0f:c3", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c2e5c7-ad", "ovs_interfaceid": "c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.895 187082 DEBUG nova.network.os_vif_util [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:0f:c3,bridge_name='br-int',has_traffic_filtering=True,id=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c2e5c7-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.896 187082 DEBUG os_vif [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:0f:c3,bridge_name='br-int',has_traffic_filtering=True,id=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c2e5c7-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.899 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.899 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5c2e5c7-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.901 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.902 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.904 187082 INFO os_vif [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:0f:c3,bridge_name='br-int',has_traffic_filtering=True,id=c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c2e5c7-ad')
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.905 187082 INFO nova.virt.libvirt.driver [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Deleting instance files /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4_del
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.905 187082 INFO nova.virt.libvirt.driver [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Deletion of /var/lib/nova/instances/fc5f8dd8-0047-4086-91fe-4976b3c4eef4_del complete
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.957 187082 INFO nova.compute.manager [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.958 187082 DEBUG oslo.service.loopingcall [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.958 187082 DEBUG nova.compute.manager [-] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:44:59 compute-1 nova_compute[187078]: 2025-11-24 13:44:59.958 187082 DEBUG nova.network.neutron [-] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.069 187082 DEBUG nova.network.neutron [-] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.287 187082 INFO nova.compute.manager [-] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Took 1.33 seconds to deallocate network for instance.
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.473 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.474 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.485 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.698 187082 INFO nova.scheduler.client.report [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Deleted allocations for instance fc5f8dd8-0047-4086-91fe-4976b3c4eef4
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.713 187082 DEBUG nova.compute.manager [req-8f39a706-b43d-4d1c-b241-70edc29ef702 req-44a103f3-b055-4b5d-9f72-c9be8760f17b 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Received event network-vif-deleted-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.764 187082 DEBUG oslo_concurrency.lockutils [None req-e860f55c-b88f-4f45-92f2-95ae89ff4615 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.889 187082 DEBUG nova.compute.manager [req-ef838012-b888-4909-be2a-59a6aa449739 req-7db285f4-6952-4fe5-92b0-0c974c272320 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Received event network-vif-plugged-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.890 187082 DEBUG oslo_concurrency.lockutils [req-ef838012-b888-4909-be2a-59a6aa449739 req-7db285f4-6952-4fe5-92b0-0c974c272320 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.890 187082 DEBUG oslo_concurrency.lockutils [req-ef838012-b888-4909-be2a-59a6aa449739 req-7db285f4-6952-4fe5-92b0-0c974c272320 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.891 187082 DEBUG oslo_concurrency.lockutils [req-ef838012-b888-4909-be2a-59a6aa449739 req-7db285f4-6952-4fe5-92b0-0c974c272320 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "fc5f8dd8-0047-4086-91fe-4976b3c4eef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.891 187082 DEBUG nova.compute.manager [req-ef838012-b888-4909-be2a-59a6aa449739 req-7db285f4-6952-4fe5-92b0-0c974c272320 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] No waiting events found dispatching network-vif-plugged-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:45:01 compute-1 nova_compute[187078]: 2025-11-24 13:45:01.892 187082 WARNING nova.compute.manager [req-ef838012-b888-4909-be2a-59a6aa449739 req-7db285f4-6952-4fe5-92b0-0c974c272320 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Received unexpected event network-vif-plugged-c5c2e5c7-ad2e-4ec7-90bd-00dd98d9e872 for instance with vm_state deleted and task_state None.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.137 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.138 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.139 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.139 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.140 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.142 187082 INFO nova.compute.manager [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Terminating instance
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.144 187082 DEBUG nova.compute.manager [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:45:04 compute-1 kernel: tap37bcaf75-ab (unregistering): left promiscuous mode
Nov 24 13:45:04 compute-1 NetworkManager[55527]: <info>  [1763991904.1671] device (tap37bcaf75-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.173 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.174 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:04 compute-1 ovn_controller[95368]: 2025-11-24T13:45:04Z|00235|binding|INFO|Releasing lport 37bcaf75-ab39-43e9-8516-af26ec1703a2 from this chassis (sb_readonly=0)
Nov 24 13:45:04 compute-1 ovn_controller[95368]: 2025-11-24T13:45:04Z|00236|binding|INFO|Setting lport 37bcaf75-ab39-43e9-8516-af26ec1703a2 down in Southbound
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.174 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.175 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 ovn_controller[95368]: 2025-11-24T13:45:04Z|00237|binding|INFO|Removing iface tap37bcaf75-ab ovn-installed in OVS
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.179 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.183 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:b9:67 10.100.0.12'], port_security=['fa:16:3e:5a:b9:67 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0c1bcc72-9195-4f95-9651-6398445db523', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e2ed81c-d809-4eac-aa1d-550e25261754', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2784a47a2a14067b899db14fb90fc19', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd261cbc3-3314-493c-9f86-e0d57ec0fbbd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e69ba9-1340-4130-a1e7-4a69be27bd25, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=37bcaf75-ab39-43e9-8516-af26ec1703a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.185 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 37bcaf75-ab39-43e9-8516-af26ec1703a2 in datapath 9e2ed81c-d809-4eac-aa1d-550e25261754 unbound from our chassis
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.186 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e2ed81c-d809-4eac-aa1d-550e25261754, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.188 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[dd226dbe-be8a-4446-b9f7-e4817d4df94c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.188 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754 namespace which is not needed anymore
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.192 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Nov 24 13:45:04 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001b.scope: Consumed 14.999s CPU time.
Nov 24 13:45:04 compute-1 systemd-machined[153355]: Machine qemu-19-instance-0000001b terminated.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.365 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.369 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [NOTICE]   (218826) : haproxy version is 2.8.14-c23fe91
Nov 24 13:45:04 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [NOTICE]   (218826) : path to executable is /usr/sbin/haproxy
Nov 24 13:45:04 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [WARNING]  (218826) : Exiting Master process...
Nov 24 13:45:04 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [ALERT]    (218826) : Current worker (218828) exited with code 143 (Terminated)
Nov 24 13:45:04 compute-1 neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754[218821]: [WARNING]  (218826) : All workers exited. Exiting... (0)
Nov 24 13:45:04 compute-1 systemd[1]: libpod-e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a.scope: Deactivated successfully.
Nov 24 13:45:04 compute-1 podman[219139]: 2025-11-24 13:45:04.396995466 +0000 UTC m=+0.080486857 container died e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.407 187082 INFO nova.virt.libvirt.driver [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Instance destroyed successfully.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.409 187082 DEBUG nova.objects.instance [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lazy-loading 'resources' on Instance uuid 0c1bcc72-9195-4f95-9651-6398445db523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.419 187082 DEBUG nova.virt.libvirt.vif [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:44:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1704049702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1704049702',id=27,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:44:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c2784a47a2a14067b899db14fb90fc19',ramdisk_id='',reservation_id='r-993a4j0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1463119205-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:44:15Z,user_data=None,user_id='edfedf0b61034f078802fdb1dc050c75',uuid=0c1bcc72-9195-4f95-9651-6398445db523,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.420 187082 DEBUG nova.network.os_vif_util [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converting VIF {"id": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "address": "fa:16:3e:5a:b9:67", "network": {"id": "9e2ed81c-d809-4eac-aa1d-550e25261754", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-585862450-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2784a47a2a14067b899db14fb90fc19", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37bcaf75-ab", "ovs_interfaceid": "37bcaf75-ab39-43e9-8516-af26ec1703a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.421 187082 DEBUG nova.network.os_vif_util [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.421 187082 DEBUG os_vif [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.424 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.424 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37bcaf75-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:45:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a-userdata-shm.mount: Deactivated successfully.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.425 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-9aa6918e132ee966bfa822bc6d0a6c9610a276017e470a59c50767911d208f31-merged.mount: Deactivated successfully.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.426 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.429 187082 INFO os_vif [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:b9:67,bridge_name='br-int',has_traffic_filtering=True,id=37bcaf75-ab39-43e9-8516-af26ec1703a2,network=Network(9e2ed81c-d809-4eac-aa1d-550e25261754),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37bcaf75-ab')
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.429 187082 INFO nova.virt.libvirt.driver [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Deleting instance files /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523_del
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.430 187082 INFO nova.virt.libvirt.driver [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Deletion of /var/lib/nova/instances/0c1bcc72-9195-4f95-9651-6398445db523_del complete
Nov 24 13:45:04 compute-1 podman[219139]: 2025-11-24 13:45:04.44116118 +0000 UTC m=+0.124652531 container cleanup e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:45:04 compute-1 systemd[1]: libpod-conmon-e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a.scope: Deactivated successfully.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.470 187082 INFO nova.compute.manager [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.471 187082 DEBUG oslo.service.loopingcall [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.471 187082 DEBUG nova.compute.manager [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.471 187082 DEBUG nova.network.neutron [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:45:04 compute-1 podman[219183]: 2025-11-24 13:45:04.503420463 +0000 UTC m=+0.039068367 container remove e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.512 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[583bf138-b11b-485e-a718-4d67cdf08ada]: (4, ('Mon Nov 24 01:45:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754 (e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a)\ne63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a\nMon Nov 24 01:45:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754 (e63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a)\ne63f837f0faabad7a3eec9af9dedef9d4ca962e284f21fd1efc8f565dc6d469a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.515 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e01e89-1822-4dea-8be0-b5561834860d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.517 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e2ed81c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.520 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 kernel: tap9e2ed81c-d0: left promiscuous mode
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.523 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.526 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd4c0ec-58ac-4579-8d41-fd3eb282ba87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.546 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.555 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[ecdd41e3-8652-4541-aed3-cf04476babcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.557 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0432da21-1809-42c2-858d-c64a4733a728]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.578 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[80753cc2-69e6-48aa-82a5-4bf64f2a87b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487374, 'reachable_time': 37698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219198, 'error': None, 'target': 'ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.582 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9e2ed81c-d809-4eac-aa1d-550e25261754 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:45:04 compute-1 systemd[1]: run-netns-ovnmeta\x2d9e2ed81c\x2dd809\x2d4eac\x2daa1d\x2d550e25261754.mount: Deactivated successfully.
Nov 24 13:45:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:45:04.582 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[544699a6-8ed5-4fdd-9f03-0e771e90905a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.650 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.818 187082 DEBUG nova.compute.manager [req-cb8e50fc-968f-4038-8bd0-5d4d09a27be8 req-fc1e01fe-66ef-4b51-bcb8-16026cf5ce0f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-vif-unplugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.818 187082 DEBUG oslo_concurrency.lockutils [req-cb8e50fc-968f-4038-8bd0-5d4d09a27be8 req-fc1e01fe-66ef-4b51-bcb8-16026cf5ce0f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.818 187082 DEBUG oslo_concurrency.lockutils [req-cb8e50fc-968f-4038-8bd0-5d4d09a27be8 req-fc1e01fe-66ef-4b51-bcb8-16026cf5ce0f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.819 187082 DEBUG oslo_concurrency.lockutils [req-cb8e50fc-968f-4038-8bd0-5d4d09a27be8 req-fc1e01fe-66ef-4b51-bcb8-16026cf5ce0f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.819 187082 DEBUG nova.compute.manager [req-cb8e50fc-968f-4038-8bd0-5d4d09a27be8 req-fc1e01fe-66ef-4b51-bcb8-16026cf5ce0f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] No waiting events found dispatching network-vif-unplugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:45:04 compute-1 nova_compute[187078]: 2025-11-24 13:45:04.819 187082 DEBUG nova.compute.manager [req-cb8e50fc-968f-4038-8bd0-5d4d09a27be8 req-fc1e01fe-66ef-4b51-bcb8-16026cf5ce0f 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-vif-unplugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.118 187082 DEBUG nova.network.neutron [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.134 187082 INFO nova.compute.manager [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Took 0.66 seconds to deallocate network for instance.
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.174 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.174 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.222 187082 DEBUG nova.compute.provider_tree [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.239 187082 DEBUG nova.scheduler.client.report [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.260 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.284 187082 INFO nova.scheduler.client.report [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Deleted allocations for instance 0c1bcc72-9195-4f95-9651-6398445db523
Nov 24 13:45:05 compute-1 nova_compute[187078]: 2025-11-24 13:45:05.369 187082 DEBUG oslo_concurrency.lockutils [None req-dd4ff79a-f309-4e62-8fe0-4912c6cb3c85 edfedf0b61034f078802fdb1dc050c75 c2784a47a2a14067b899db14fb90fc19 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:05 compute-1 podman[197429]: time="2025-11-24T13:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:45:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:45:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 24 13:45:06 compute-1 podman[219199]: 2025-11-24 13:45:06.565042686 +0000 UTC m=+0.097995160 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:45:06 compute-1 podman[219200]: 2025-11-24 13:45:06.569142887 +0000 UTC m=+0.098211627 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.882 187082 DEBUG nova.compute.manager [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.883 187082 DEBUG oslo_concurrency.lockutils [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "0c1bcc72-9195-4f95-9651-6398445db523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.884 187082 DEBUG oslo_concurrency.lockutils [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.884 187082 DEBUG oslo_concurrency.lockutils [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "0c1bcc72-9195-4f95-9651-6398445db523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.884 187082 DEBUG nova.compute.manager [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] No waiting events found dispatching network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.885 187082 WARNING nova.compute.manager [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received unexpected event network-vif-plugged-37bcaf75-ab39-43e9-8516-af26ec1703a2 for instance with vm_state deleted and task_state None.
Nov 24 13:45:06 compute-1 nova_compute[187078]: 2025-11-24 13:45:06.886 187082 DEBUG nova.compute.manager [req-c8c8f710-74ff-4ceb-a7d8-76886a4f178c req-e6a5181a-4a0d-4f51-8554-7c2dc35bb761 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Received event network-vif-deleted-37bcaf75-ab39-43e9-8516-af26ec1703a2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:45:09 compute-1 nova_compute[187078]: 2025-11-24 13:45:09.427 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:09 compute-1 nova_compute[187078]: 2025-11-24 13:45:09.652 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:11 compute-1 podman[219239]: 2025-11-24 13:45:11.546337135 +0000 UTC m=+0.084932097 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:45:11 compute-1 podman[219240]: 2025-11-24 13:45:11.564101386 +0000 UTC m=+0.094836245 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 24 13:45:14 compute-1 nova_compute[187078]: 2025-11-24 13:45:14.430 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:14 compute-1 nova_compute[187078]: 2025-11-24 13:45:14.653 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:14 compute-1 nova_compute[187078]: 2025-11-24 13:45:14.882 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991899.881261, fc5f8dd8-0047-4086-91fe-4976b3c4eef4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:45:14 compute-1 nova_compute[187078]: 2025-11-24 13:45:14.882 187082 INFO nova.compute.manager [-] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] VM Stopped (Lifecycle Event)
Nov 24 13:45:14 compute-1 nova_compute[187078]: 2025-11-24 13:45:14.900 187082 DEBUG nova.compute.manager [None req-3dca8c0c-a45e-4b8f-8b78-47b2ac8dbaaf - - - - - -] [instance: fc5f8dd8-0047-4086-91fe-4976b3c4eef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:45:19 compute-1 nova_compute[187078]: 2025-11-24 13:45:19.404 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763991904.4025621, 0c1bcc72-9195-4f95-9651-6398445db523 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:45:19 compute-1 nova_compute[187078]: 2025-11-24 13:45:19.405 187082 INFO nova.compute.manager [-] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] VM Stopped (Lifecycle Event)
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: ERROR   13:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: ERROR   13:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: ERROR   13:45:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: ERROR   13:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: ERROR   13:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:45:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:45:19 compute-1 nova_compute[187078]: 2025-11-24 13:45:19.426 187082 DEBUG nova.compute.manager [None req-ee6f4a2e-3c40-447a-8514-0f1b72df8126 - - - - - -] [instance: 0c1bcc72-9195-4f95-9651-6398445db523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:45:19 compute-1 nova_compute[187078]: 2025-11-24 13:45:19.432 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:19 compute-1 nova_compute[187078]: 2025-11-24 13:45:19.654 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:20 compute-1 nova_compute[187078]: 2025-11-24 13:45:20.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:22 compute-1 podman[219284]: 2025-11-24 13:45:22.549453324 +0000 UTC m=+0.075035370 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41)
Nov 24 13:45:23 compute-1 nova_compute[187078]: 2025-11-24 13:45:23.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:24 compute-1 nova_compute[187078]: 2025-11-24 13:45:24.437 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:24 compute-1 nova_compute[187078]: 2025-11-24 13:45:24.657 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:25 compute-1 nova_compute[187078]: 2025-11-24 13:45:25.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:28 compute-1 nova_compute[187078]: 2025-11-24 13:45:28.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:28 compute-1 nova_compute[187078]: 2025-11-24 13:45:28.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:29 compute-1 nova_compute[187078]: 2025-11-24 13:45:29.440 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:29 compute-1 nova_compute[187078]: 2025-11-24 13:45:29.659 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:30 compute-1 nova_compute[187078]: 2025-11-24 13:45:30.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:32 compute-1 nova_compute[187078]: 2025-11-24 13:45:32.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:32 compute-1 nova_compute[187078]: 2025-11-24 13:45:32.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.683 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.683 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.706 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.706 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.706 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.707 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.926 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.928 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5897MB free_disk=73.45534133911133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.928 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:45:33 compute-1 nova_compute[187078]: 2025-11-24 13:45:33.929 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.025 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.026 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.134 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.153 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.174 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.175 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.442 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:34 compute-1 nova_compute[187078]: 2025-11-24 13:45:34.661 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:35 compute-1 ovn_controller[95368]: 2025-11-24T13:45:35Z|00238|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Nov 24 13:45:35 compute-1 podman[197429]: time="2025-11-24T13:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:45:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:45:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 24 13:45:37 compute-1 nova_compute[187078]: 2025-11-24 13:45:37.168 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:37 compute-1 podman[219308]: 2025-11-24 13:45:37.260760757 +0000 UTC m=+0.061225265 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:45:37 compute-1 podman[219309]: 2025-11-24 13:45:37.286591336 +0000 UTC m=+0.069868550 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 13:45:39 compute-1 nova_compute[187078]: 2025-11-24 13:45:39.478 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:39 compute-1 nova_compute[187078]: 2025-11-24 13:45:39.663 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:40 compute-1 nova_compute[187078]: 2025-11-24 13:45:40.656 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:42 compute-1 podman[219349]: 2025-11-24 13:45:42.52480038 +0000 UTC m=+0.065672886 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 13:45:42 compute-1 podman[219350]: 2025-11-24 13:45:42.587928517 +0000 UTC m=+0.129021068 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:45:42 compute-1 nova_compute[187078]: 2025-11-24 13:45:42.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:42 compute-1 nova_compute[187078]: 2025-11-24 13:45:42.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:45:42 compute-1 nova_compute[187078]: 2025-11-24 13:45:42.681 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:45:44 compute-1 nova_compute[187078]: 2025-11-24 13:45:44.482 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:44 compute-1 nova_compute[187078]: 2025-11-24 13:45:44.665 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:45 compute-1 nova_compute[187078]: 2025-11-24 13:45:45.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:45:45 compute-1 nova_compute[187078]: 2025-11-24 13:45:45.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: ERROR   13:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: ERROR   13:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: ERROR   13:45:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: ERROR   13:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: ERROR   13:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:45:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:45:49 compute-1 nova_compute[187078]: 2025-11-24 13:45:49.485 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:49 compute-1 nova_compute[187078]: 2025-11-24 13:45:49.668 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:52 compute-1 sshd-session[219395]: Invalid user sol from 45.148.10.240 port 58962
Nov 24 13:45:52 compute-1 sshd-session[219395]: Connection closed by invalid user sol 45.148.10.240 port 58962 [preauth]
Nov 24 13:45:53 compute-1 podman[219397]: 2025-11-24 13:45:53.521677161 +0000 UTC m=+0.063950040 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 24 13:45:54 compute-1 nova_compute[187078]: 2025-11-24 13:45:54.488 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:54 compute-1 sshd-session[219393]: Invalid user dev from 45.78.217.131 port 38938
Nov 24 13:45:54 compute-1 nova_compute[187078]: 2025-11-24 13:45:54.670 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:55 compute-1 sshd-session[219393]: Received disconnect from 45.78.217.131 port 38938:11: Bye Bye [preauth]
Nov 24 13:45:55 compute-1 sshd-session[219393]: Disconnected from invalid user dev 45.78.217.131 port 38938 [preauth]
Nov 24 13:45:59 compute-1 nova_compute[187078]: 2025-11-24 13:45:59.490 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:45:59 compute-1 nova_compute[187078]: 2025-11-24 13:45:59.671 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:01 compute-1 nova_compute[187078]: 2025-11-24 13:46:01.884 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:01.884 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:46:01 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:01.888 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:46:02 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:02.890 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:04.174 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:46:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:04.175 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:46:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:04.175 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:46:04 compute-1 nova_compute[187078]: 2025-11-24 13:46:04.494 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:04 compute-1 nova_compute[187078]: 2025-11-24 13:46:04.671 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:05 compute-1 podman[197429]: time="2025-11-24T13:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:46:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:46:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Nov 24 13:46:05 compute-1 sshd-session[219418]: Invalid user vpnuser from 175.100.24.139 port 54918
Nov 24 13:46:06 compute-1 sshd-session[219418]: Received disconnect from 175.100.24.139 port 54918:11: Bye Bye [preauth]
Nov 24 13:46:06 compute-1 sshd-session[219418]: Disconnected from invalid user vpnuser 175.100.24.139 port 54918 [preauth]
Nov 24 13:46:07 compute-1 podman[219420]: 2025-11-24 13:46:07.516556295 +0000 UTC m=+0.061420701 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:46:07 compute-1 podman[219421]: 2025-11-24 13:46:07.561767447 +0000 UTC m=+0.091913826 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:46:09 compute-1 nova_compute[187078]: 2025-11-24 13:46:09.496 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:09 compute-1 nova_compute[187078]: 2025-11-24 13:46:09.673 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:13 compute-1 podman[219463]: 2025-11-24 13:46:13.544349245 +0000 UTC m=+0.089749227 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:46:13 compute-1 podman[219462]: 2025-11-24 13:46:13.554478419 +0000 UTC m=+0.097140587 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 24 13:46:14 compute-1 nova_compute[187078]: 2025-11-24 13:46:14.545 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:14 compute-1 nova_compute[187078]: 2025-11-24 13:46:14.676 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: ERROR   13:46:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: ERROR   13:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: ERROR   13:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: ERROR   13:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: ERROR   13:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:46:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:46:19 compute-1 nova_compute[187078]: 2025-11-24 13:46:19.548 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:19 compute-1 nova_compute[187078]: 2025-11-24 13:46:19.677 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:20 compute-1 ovn_controller[95368]: 2025-11-24T13:46:20Z|00239|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 24 13:46:23 compute-1 nova_compute[187078]: 2025-11-24 13:46:23.681 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:24 compute-1 podman[219508]: 2025-11-24 13:46:24.528751617 +0000 UTC m=+0.081441413 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 24 13:46:24 compute-1 nova_compute[187078]: 2025-11-24 13:46:24.550 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:24 compute-1 nova_compute[187078]: 2025-11-24 13:46:24.679 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:26 compute-1 nova_compute[187078]: 2025-11-24 13:46:26.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:28 compute-1 nova_compute[187078]: 2025-11-24 13:46:28.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:29 compute-1 nova_compute[187078]: 2025-11-24 13:46:29.597 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:29 compute-1 nova_compute[187078]: 2025-11-24 13:46:29.680 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:30 compute-1 nova_compute[187078]: 2025-11-24 13:46:30.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:31 compute-1 nova_compute[187078]: 2025-11-24 13:46:31.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:32 compute-1 nova_compute[187078]: 2025-11-24 13:46:32.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:32 compute-1 nova_compute[187078]: 2025-11-24 13:46:32.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:46:34 compute-1 nova_compute[187078]: 2025-11-24 13:46:34.600 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:34 compute-1 nova_compute[187078]: 2025-11-24 13:46:34.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:34 compute-1 nova_compute[187078]: 2025-11-24 13:46:34.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:46:34 compute-1 nova_compute[187078]: 2025-11-24 13:46:34.668 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:46:34 compute-1 nova_compute[187078]: 2025-11-24 13:46:34.680 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:46:34 compute-1 nova_compute[187078]: 2025-11-24 13:46:34.683 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:35 compute-1 podman[197429]: time="2025-11-24T13:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:46:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:46:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.719 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.720 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.721 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.721 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.907 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.908 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5887MB free_disk=73.45534133911133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.908 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:46:35 compute-1 nova_compute[187078]: 2025-11-24 13:46:35.909 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:46:36 compute-1 nova_compute[187078]: 2025-11-24 13:46:36.007 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:46:36 compute-1 nova_compute[187078]: 2025-11-24 13:46:36.007 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:46:36 compute-1 nova_compute[187078]: 2025-11-24 13:46:36.032 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:46:36 compute-1 nova_compute[187078]: 2025-11-24 13:46:36.050 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:46:36 compute-1 nova_compute[187078]: 2025-11-24 13:46:36.052 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:46:36 compute-1 nova_compute[187078]: 2025-11-24 13:46:36.052 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:46:38 compute-1 podman[219531]: 2025-11-24 13:46:38.517921429 +0000 UTC m=+0.063035394 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:46:38 compute-1 podman[219532]: 2025-11-24 13:46:38.535532015 +0000 UTC m=+0.075491042 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 13:46:39 compute-1 nova_compute[187078]: 2025-11-24 13:46:39.047 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:39 compute-1 nova_compute[187078]: 2025-11-24 13:46:39.603 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:39 compute-1 nova_compute[187078]: 2025-11-24 13:46:39.686 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:42 compute-1 nova_compute[187078]: 2025-11-24 13:46:42.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:46:44 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 13:46:44 compute-1 podman[219574]: 2025-11-24 13:46:44.271564258 +0000 UTC m=+0.083908499 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 13:46:44 compute-1 podman[219575]: 2025-11-24 13:46:44.300848989 +0000 UTC m=+0.110272712 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:46:44 compute-1 nova_compute[187078]: 2025-11-24 13:46:44.605 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:44 compute-1 nova_compute[187078]: 2025-11-24 13:46:44.687 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:45 compute-1 nova_compute[187078]: 2025-11-24 13:46:45.404 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Creating tmpfile /var/lib/nova/instances/tmp52km8e2b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 24 13:46:45 compute-1 nova_compute[187078]: 2025-11-24 13:46:45.405 187082 DEBUG nova.compute.manager [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp52km8e2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 24 13:46:47 compute-1 nova_compute[187078]: 2025-11-24 13:46:47.043 187082 DEBUG nova.compute.manager [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp52km8e2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e9ba893-015d-46a7-ba89-d40b181c6c9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 24 13:46:47 compute-1 nova_compute[187078]: 2025-11-24 13:46:47.069 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-9e9ba893-015d-46a7-ba89-d40b181c6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:46:47 compute-1 nova_compute[187078]: 2025-11-24 13:46:47.069 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-9e9ba893-015d-46a7-ba89-d40b181c6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:46:47 compute-1 nova_compute[187078]: 2025-11-24 13:46:47.069 187082 DEBUG nova.network.neutron [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:46:48 compute-1 nova_compute[187078]: 2025-11-24 13:46:48.991 187082 DEBUG nova.network.neutron [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Updating instance_info_cache with network_info: [{"id": "485ea993-42ae-4c99-83e8-bd0e83107400", "address": "fa:16:3e:c2:20:b6", "network": {"id": "54fc22e7-056f-4f9b-ad81-953ca36f7f10", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1602884600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e04aa76fa15645e08bc0a355328db96e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485ea993-42", "ovs_interfaceid": "485ea993-42ae-4c99-83e8-bd0e83107400", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.002 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-9e9ba893-015d-46a7-ba89-d40b181c6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.005 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp52km8e2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e9ba893-015d-46a7-ba89-d40b181c6c9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.005 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Creating instance directory: /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.006 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Creating disk.info with the contents: {'/var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk': 'qcow2', '/var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.007 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.008 187082 DEBUG nova.objects.instance [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9e9ba893-015d-46a7-ba89-d40b181c6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.043 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.108 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.110 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "4a622edf34d6c396497a8622355dd999c6ac487f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.112 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.140 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.208 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.210 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.252 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f,backing_fmt=raw /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.254 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "4a622edf34d6c396497a8622355dd999c6ac487f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.255 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.317 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.320 187082 DEBUG nova.virt.disk.api [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Checking if we can resize image /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.321 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.402 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.404 187082 DEBUG nova.virt.disk.api [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Cannot resize image /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.405 187082 DEBUG nova.objects.instance [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e9ba893-015d-46a7-ba89-d40b181c6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:46:49 compute-1 openstack_network_exporter[199599]: ERROR   13:46:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:46:49 compute-1 openstack_network_exporter[199599]: ERROR   13:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:46:49 compute-1 openstack_network_exporter[199599]: ERROR   13:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:46:49 compute-1 openstack_network_exporter[199599]: ERROR   13:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:46:49 compute-1 openstack_network_exporter[199599]: ERROR   13:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.423 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.453 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk.config 485376" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.454 187082 DEBUG nova.virt.libvirt.volume.remotefs [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk.config to /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.454 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk.config /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.607 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.689 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.974 187082 DEBUG oslo_concurrency.processutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c/disk.config /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.975 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.977 187082 DEBUG nova.virt.libvirt.vif [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T13:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-232939924',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-232939924',id=29,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:46:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e04aa76fa15645e08bc0a355328db96e',ramdisk_id='',reservation_id='r-shwd1j0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-781819625',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-781819625-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T13:46:20Z,user_data=None,user_id='a8565a9de374413d99e6b1cd58da0895',uuid=9e9ba893-015d-46a7-ba89-d40b181c6c9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "485ea993-42ae-4c99-83e8-bd0e83107400", "address": "fa:16:3e:c2:20:b6", "network": {"id": "54fc22e7-056f-4f9b-ad81-953ca36f7f10", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1602884600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e04aa76fa15645e08bc0a355328db96e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap485ea993-42", "ovs_interfaceid": "485ea993-42ae-4c99-83e8-bd0e83107400", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.977 187082 DEBUG nova.network.os_vif_util [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converting VIF {"id": "485ea993-42ae-4c99-83e8-bd0e83107400", "address": "fa:16:3e:c2:20:b6", "network": {"id": "54fc22e7-056f-4f9b-ad81-953ca36f7f10", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1602884600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e04aa76fa15645e08bc0a355328db96e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap485ea993-42", "ovs_interfaceid": "485ea993-42ae-4c99-83e8-bd0e83107400", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.978 187082 DEBUG nova.network.os_vif_util [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:20:b6,bridge_name='br-int',has_traffic_filtering=True,id=485ea993-42ae-4c99-83e8-bd0e83107400,network=Network(54fc22e7-056f-4f9b-ad81-953ca36f7f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485ea993-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.978 187082 DEBUG os_vif [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:20:b6,bridge_name='br-int',has_traffic_filtering=True,id=485ea993-42ae-4c99-83e8-bd0e83107400,network=Network(54fc22e7-056f-4f9b-ad81-953ca36f7f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485ea993-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.979 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.980 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.981 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.983 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.984 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap485ea993-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:49 compute-1 nova_compute[187078]: 2025-11-24 13:46:49.984 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap485ea993-42, col_values=(('external_ids', {'iface-id': '485ea993-42ae-4c99-83e8-bd0e83107400', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:20:b6', 'vm-uuid': '9e9ba893-015d-46a7-ba89-d40b181c6c9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:50 compute-1 NetworkManager[55527]: <info>  [1763992010.0211] manager: (tap485ea993-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 24 13:46:50 compute-1 nova_compute[187078]: 2025-11-24 13:46:50.020 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:50 compute-1 nova_compute[187078]: 2025-11-24 13:46:50.024 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 13:46:50 compute-1 nova_compute[187078]: 2025-11-24 13:46:50.026 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:50 compute-1 nova_compute[187078]: 2025-11-24 13:46:50.028 187082 INFO os_vif [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:20:b6,bridge_name='br-int',has_traffic_filtering=True,id=485ea993-42ae-4c99-83e8-bd0e83107400,network=Network(54fc22e7-056f-4f9b-ad81-953ca36f7f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485ea993-42')
Nov 24 13:46:50 compute-1 nova_compute[187078]: 2025-11-24 13:46:50.029 187082 DEBUG nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 24 13:46:50 compute-1 nova_compute[187078]: 2025-11-24 13:46:50.029 187082 DEBUG nova.compute.manager [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp52km8e2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e9ba893-015d-46a7-ba89-d40b181c6c9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 24 13:46:51 compute-1 nova_compute[187078]: 2025-11-24 13:46:51.887 187082 DEBUG nova.network.neutron [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Port 485ea993-42ae-4c99-83e8-bd0e83107400 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 24 13:46:51 compute-1 nova_compute[187078]: 2025-11-24 13:46:51.889 187082 DEBUG nova.compute.manager [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp52km8e2b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e9ba893-015d-46a7-ba89-d40b181c6c9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 24 13:46:52 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 24 13:46:52 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 24 13:46:52 compute-1 kernel: tap485ea993-42: entered promiscuous mode
Nov 24 13:46:52 compute-1 NetworkManager[55527]: <info>  [1763992012.2192] manager: (tap485ea993-42): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Nov 24 13:46:52 compute-1 ovn_controller[95368]: 2025-11-24T13:46:52Z|00240|binding|INFO|Claiming lport 485ea993-42ae-4c99-83e8-bd0e83107400 for this additional chassis.
Nov 24 13:46:52 compute-1 nova_compute[187078]: 2025-11-24 13:46:52.261 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:52 compute-1 ovn_controller[95368]: 2025-11-24T13:46:52Z|00241|binding|INFO|485ea993-42ae-4c99-83e8-bd0e83107400: Claiming fa:16:3e:c2:20:b6 10.100.0.12
Nov 24 13:46:52 compute-1 systemd-udevd[219673]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:46:52 compute-1 nova_compute[187078]: 2025-11-24 13:46:52.265 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:52 compute-1 NetworkManager[55527]: <info>  [1763992012.2773] device (tap485ea993-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:46:52 compute-1 NetworkManager[55527]: <info>  [1763992012.2786] device (tap485ea993-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 13:46:52 compute-1 systemd-machined[153355]: New machine qemu-21-instance-0000001d.
Nov 24 13:46:52 compute-1 systemd[1]: Started Virtual Machine qemu-21-instance-0000001d.
Nov 24 13:46:52 compute-1 ovn_controller[95368]: 2025-11-24T13:46:52Z|00242|binding|INFO|Setting lport 485ea993-42ae-4c99-83e8-bd0e83107400 ovn-installed in OVS
Nov 24 13:46:52 compute-1 nova_compute[187078]: 2025-11-24 13:46:52.322 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:52 compute-1 nova_compute[187078]: 2025-11-24 13:46:52.324 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:54 compute-1 nova_compute[187078]: 2025-11-24 13:46:54.690 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:54 compute-1 nova_compute[187078]: 2025-11-24 13:46:54.788 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763992014.785136, 9e9ba893-015d-46a7-ba89-d40b181c6c9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:46:54 compute-1 nova_compute[187078]: 2025-11-24 13:46:54.789 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] VM Started (Lifecycle Event)
Nov 24 13:46:54 compute-1 nova_compute[187078]: 2025-11-24 13:46:54.812 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:46:55 compute-1 nova_compute[187078]: 2025-11-24 13:46:55.046 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:55 compute-1 nova_compute[187078]: 2025-11-24 13:46:55.524 187082 DEBUG nova.virt.driver [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] Emitting event <LifecycleEvent: 1763992015.5237901, 9e9ba893-015d-46a7-ba89-d40b181c6c9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:46:55 compute-1 nova_compute[187078]: 2025-11-24 13:46:55.524 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] VM Resumed (Lifecycle Event)
Nov 24 13:46:55 compute-1 nova_compute[187078]: 2025-11-24 13:46:55.546 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:46:55 compute-1 nova_compute[187078]: 2025-11-24 13:46:55.550 187082 DEBUG nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 13:46:55 compute-1 podman[219704]: 2025-11-24 13:46:55.555803386 +0000 UTC m=+0.099825880 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter)
Nov 24 13:46:55 compute-1 nova_compute[187078]: 2025-11-24 13:46:55.579 187082 INFO nova.compute.manager [None req-ab40100a-2f96-4371-b1a4-aa84ef12b9ad - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Nov 24 13:46:57 compute-1 ovn_controller[95368]: 2025-11-24T13:46:57Z|00243|binding|INFO|Claiming lport 485ea993-42ae-4c99-83e8-bd0e83107400 for this chassis.
Nov 24 13:46:57 compute-1 ovn_controller[95368]: 2025-11-24T13:46:57Z|00244|binding|INFO|485ea993-42ae-4c99-83e8-bd0e83107400: Claiming fa:16:3e:c2:20:b6 10.100.0.12
Nov 24 13:46:57 compute-1 ovn_controller[95368]: 2025-11-24T13:46:57Z|00245|binding|INFO|Setting lport 485ea993-42ae-4c99-83e8-bd0e83107400 up in Southbound
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.016 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:20:b6 10.100.0.12'], port_security=['fa:16:3e:c2:20:b6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9e9ba893-015d-46a7-ba89-d40b181c6c9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e04aa76fa15645e08bc0a355328db96e', 'neutron:revision_number': '11', 'neutron:security_group_ids': '67540aec-662a-4651-9335-e0f9144b5918', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ab615d1-01de-4194-b82e-d991324f3e21, chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=485ea993-42ae-4c99-83e8-bd0e83107400) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.017 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 485ea993-42ae-4c99-83e8-bd0e83107400 in datapath 54fc22e7-056f-4f9b-ad81-953ca36f7f10 bound to our chassis
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.019 104225 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 54fc22e7-056f-4f9b-ad81-953ca36f7f10
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.040 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0c75b31b-9aa3-4f3e-b15e-323a77199cec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.042 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap54fc22e7-01 in ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.047 208599 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap54fc22e7-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.047 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[4a60ab1d-4710-4780-bfd3-77c28139587c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.048 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[f89c7376-3a26-458b-99d4-ea6fab2747e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.063 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[79e1694f-0661-46c0-864f-dbb67e3ac564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.092 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d73f603c-445c-4ce0-b97e-cdec11815cbb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.125 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[afa06bec-3323-4405-a8ed-b3ce9ea8c74b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.138 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8867a9d5-9842-4b01-8e76-b4e6ab517452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 NetworkManager[55527]: <info>  [1763992017.1389] manager: (tap54fc22e7-00): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Nov 24 13:46:57 compute-1 systemd-udevd[219732]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.170 187082 INFO nova.compute.manager [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Post operation of migration started
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.183 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[d50b5ae8-85e4-4b57-9fee-9ed4900cf8c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.188 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[0437308e-6315-470e-9cb2-82e9b71e6dd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 NetworkManager[55527]: <info>  [1763992017.2232] device (tap54fc22e7-00): carrier: link connected
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.230 208613 DEBUG oslo.privsep.daemon [-] privsep: reply[2764cddb-6f40-40ab-8b5d-808c15178fe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.252 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[a2369a5c-9219-49c2-8620-aa67f58d798e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54fc22e7-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:06:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503588, 'reachable_time': 43375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219751, 'error': None, 'target': 'ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.276 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1ebf331e-a793-47c7-9543-4e54e07cb080]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:659'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503588, 'tstamp': 503588}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219752, 'error': None, 'target': 'ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.305 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[0abdb51e-f3f2-48ad-a8f7-b5607d05a399]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54fc22e7-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:06:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503588, 'reachable_time': 43375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219753, 'error': None, 'target': 'ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.348 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[08679c16-2191-4cf4-897b-689a8ab7debb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.434 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[25b57901-ed2b-493d-a9f5-da13cecb6589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.435 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54fc22e7-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.435 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.436 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54fc22e7-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.438 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:57 compute-1 NetworkManager[55527]: <info>  [1763992017.4395] manager: (tap54fc22e7-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 24 13:46:57 compute-1 kernel: tap54fc22e7-00: entered promiscuous mode
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.441 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.443 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap54fc22e7-00, col_values=(('external_ids', {'iface-id': '9323ed5b-fd7b-47b8-b12e-38859e7f063a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:46:57 compute-1 ovn_controller[95368]: 2025-11-24T13:46:57Z|00246|binding|INFO|Releasing lport 9323ed5b-fd7b-47b8-b12e-38859e7f063a from this chassis (sb_readonly=0)
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.444 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.460 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.461 104225 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/54fc22e7-056f-4f9b-ad81-953ca36f7f10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/54fc22e7-056f-4f9b-ad81-953ca36f7f10.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.462 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[d243961e-eaee-48cd-9e68-0884771d7f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.463 104225 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: global
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     log         /dev/log local0 debug
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     log-tag     haproxy-metadata-proxy-54fc22e7-056f-4f9b-ad81-953ca36f7f10
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     user        root
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     group       root
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     maxconn     1024
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     pidfile     /var/lib/neutron/external/pids/54fc22e7-056f-4f9b-ad81-953ca36f7f10.pid.haproxy
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     daemon
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: defaults
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     log global
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     mode http
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     option httplog
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     option dontlognull
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     option http-server-close
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     option forwardfor
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     retries                 3
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     timeout http-request    30s
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     timeout connect         30s
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     timeout client          32s
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     timeout server          32s
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     timeout http-keep-alive 30s
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: listen listener
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     bind 169.254.169.254:80
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:     http-request add-header X-OVN-Network-ID 54fc22e7-056f-4f9b-ad81-953ca36f7f10
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 13:46:57 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:46:57.464 104225 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'env', 'PROCESS_TAG=haproxy-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/54fc22e7-056f-4f9b-ad81-953ca36f7f10.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 13:46:57 compute-1 podman[219787]: 2025-11-24 13:46:57.83685598 +0000 UTC m=+0.051525004 container create 8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 24 13:46:57 compute-1 systemd[1]: Started libpod-conmon-8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4.scope.
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.885 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "refresh_cache-9e9ba893-015d-46a7-ba89-d40b181c6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.886 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquired lock "refresh_cache-9e9ba893-015d-46a7-ba89-d40b181c6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 13:46:57 compute-1 nova_compute[187078]: 2025-11-24 13:46:57.886 187082 DEBUG nova.network.neutron [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 13:46:57 compute-1 systemd[1]: Started libcrun container.
Nov 24 13:46:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e5f41db998fdf87d7ad42f27ce9b154f5721a64b2924cb09fc4e860592af971/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 13:46:57 compute-1 podman[219787]: 2025-11-24 13:46:57.811590287 +0000 UTC m=+0.026259331 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 24 13:46:57 compute-1 podman[219787]: 2025-11-24 13:46:57.923145903 +0000 UTC m=+0.137814957 container init 8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 24 13:46:57 compute-1 podman[219787]: 2025-11-24 13:46:57.933141983 +0000 UTC m=+0.147811007 container start 8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 13:46:57 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [NOTICE]   (219806) : New worker (219808) forked
Nov 24 13:46:57 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [NOTICE]   (219806) : Loading success.
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.200 187082 DEBUG nova.network.neutron [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Updating instance_info_cache with network_info: [{"id": "485ea993-42ae-4c99-83e8-bd0e83107400", "address": "fa:16:3e:c2:20:b6", "network": {"id": "54fc22e7-056f-4f9b-ad81-953ca36f7f10", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1602884600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e04aa76fa15645e08bc0a355328db96e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485ea993-42", "ovs_interfaceid": "485ea993-42ae-4c99-83e8-bd0e83107400", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.221 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Releasing lock "refresh_cache-9e9ba893-015d-46a7-ba89-d40b181c6c9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.233 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.235 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.235 187082 DEBUG oslo_concurrency.lockutils [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.241 187082 INFO nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 24 13:46:59 compute-1 virtqemud[186628]: Domain id=21 name='instance-0000001d' uuid=9e9ba893-015d-46a7-ba89-d40b181c6c9c is tainted: custom-monitor
Nov 24 13:46:59 compute-1 nova_compute[187078]: 2025-11-24 13:46:59.692 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:00 compute-1 nova_compute[187078]: 2025-11-24 13:47:00.050 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:00 compute-1 nova_compute[187078]: 2025-11-24 13:47:00.249 187082 INFO nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 24 13:47:01 compute-1 nova_compute[187078]: 2025-11-24 13:47:01.256 187082 INFO nova.virt.libvirt.driver [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 24 13:47:01 compute-1 nova_compute[187078]: 2025-11-24 13:47:01.261 187082 DEBUG nova.compute.manager [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:47:01 compute-1 nova_compute[187078]: 2025-11-24 13:47:01.279 187082 DEBUG nova.objects.instance [None req-9c719ab7-b2f5-4d5b-a209-1830f1a7af38 25d68f11f3f44d42b6a7f35a440a2a7a e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.946 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Acquiring lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.946 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.947 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Acquiring lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.948 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.948 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.950 187082 INFO nova.compute.manager [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Terminating instance
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.952 187082 DEBUG nova.compute.manager [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 13:47:03 compute-1 kernel: tap485ea993-42 (unregistering): left promiscuous mode
Nov 24 13:47:03 compute-1 NetworkManager[55527]: <info>  [1763992023.9732] device (tap485ea993-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 13:47:03 compute-1 ovn_controller[95368]: 2025-11-24T13:47:03Z|00247|binding|INFO|Releasing lport 485ea993-42ae-4c99-83e8-bd0e83107400 from this chassis (sb_readonly=0)
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.979 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:03 compute-1 ovn_controller[95368]: 2025-11-24T13:47:03Z|00248|binding|INFO|Setting lport 485ea993-42ae-4c99-83e8-bd0e83107400 down in Southbound
Nov 24 13:47:03 compute-1 ovn_controller[95368]: 2025-11-24T13:47:03Z|00249|binding|INFO|Removing iface tap485ea993-42 ovn-installed in OVS
Nov 24 13:47:03 compute-1 nova_compute[187078]: 2025-11-24 13:47:03.997 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.028 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:20:b6 10.100.0.12'], port_security=['fa:16:3e:c2:20:b6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9e9ba893-015d-46a7-ba89-d40b181c6c9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e04aa76fa15645e08bc0a355328db96e', 'neutron:revision_number': '13', 'neutron:security_group_ids': '67540aec-662a-4651-9335-e0f9144b5918', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ab615d1-01de-4194-b82e-d991324f3e21, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>], logical_port=485ea993-42ae-4c99-83e8-bd0e83107400) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f588fc1e8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.031 104225 INFO neutron.agent.ovn.metadata.agent [-] Port 485ea993-42ae-4c99-83e8-bd0e83107400 in datapath 54fc22e7-056f-4f9b-ad81-953ca36f7f10 unbound from our chassis
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.033 104225 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54fc22e7-056f-4f9b-ad81-953ca36f7f10, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.034 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b568ac13-f071-484a-95ea-d4f2361e0211]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.035 104225 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10 namespace which is not needed anymore
Nov 24 13:47:04 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Nov 24 13:47:04 compute-1 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Consumed 3.330s CPU time.
Nov 24 13:47:04 compute-1 systemd-machined[153355]: Machine qemu-21-instance-0000001d terminated.
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.175 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.175 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.176 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:04 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [NOTICE]   (219806) : haproxy version is 2.8.14-c23fe91
Nov 24 13:47:04 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [NOTICE]   (219806) : path to executable is /usr/sbin/haproxy
Nov 24 13:47:04 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [WARNING]  (219806) : Exiting Master process...
Nov 24 13:47:04 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [ALERT]    (219806) : Current worker (219808) exited with code 143 (Terminated)
Nov 24 13:47:04 compute-1 neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10[219802]: [WARNING]  (219806) : All workers exited. Exiting... (0)
Nov 24 13:47:04 compute-1 systemd[1]: libpod-8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4.scope: Deactivated successfully.
Nov 24 13:47:04 compute-1 podman[219839]: 2025-11-24 13:47:04.209955664 +0000 UTC m=+0.050887777 container died 8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.227 187082 INFO nova.virt.libvirt.driver [-] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Instance destroyed successfully.
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.228 187082 DEBUG nova.objects.instance [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lazy-loading 'resources' on Instance uuid 9e9ba893-015d-46a7-ba89-d40b181c6c9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 13:47:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4-userdata-shm.mount: Deactivated successfully.
Nov 24 13:47:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-9e5f41db998fdf87d7ad42f27ce9b154f5721a64b2924cb09fc4e860592af971-merged.mount: Deactivated successfully.
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.243 187082 DEBUG nova.virt.libvirt.vif [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-24T13:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-232939924',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-232939924',id=29,image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-24T13:46:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e04aa76fa15645e08bc0a355328db96e',ramdisk_id='',reservation_id='r-shwd1j0b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='1d4afc77-cb95-49a2-9165-f8ceca2998fc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-781819625',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-781819625-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T13:47:01Z,user_data=None,user_id='a8565a9de374413d99e6b1cd58da0895',uuid=9e9ba893-015d-46a7-ba89-d40b181c6c9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "485ea993-42ae-4c99-83e8-bd0e83107400", "address": "fa:16:3e:c2:20:b6", "network": {"id": "54fc22e7-056f-4f9b-ad81-953ca36f7f10", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1602884600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e04aa76fa15645e08bc0a355328db96e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485ea993-42", "ovs_interfaceid": "485ea993-42ae-4c99-83e8-bd0e83107400", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.243 187082 DEBUG nova.network.os_vif_util [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Converting VIF {"id": "485ea993-42ae-4c99-83e8-bd0e83107400", "address": "fa:16:3e:c2:20:b6", "network": {"id": "54fc22e7-056f-4f9b-ad81-953ca36f7f10", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1602884600-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e04aa76fa15645e08bc0a355328db96e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485ea993-42", "ovs_interfaceid": "485ea993-42ae-4c99-83e8-bd0e83107400", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 13:47:04 compute-1 podman[219839]: 2025-11-24 13:47:04.244453867 +0000 UTC m=+0.085385970 container cleanup 8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.244 187082 DEBUG nova.network.os_vif_util [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:20:b6,bridge_name='br-int',has_traffic_filtering=True,id=485ea993-42ae-4c99-83e8-bd0e83107400,network=Network(54fc22e7-056f-4f9b-ad81-953ca36f7f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485ea993-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.244 187082 DEBUG os_vif [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:20:b6,bridge_name='br-int',has_traffic_filtering=True,id=485ea993-42ae-4c99-83e8-bd0e83107400,network=Network(54fc22e7-056f-4f9b-ad81-953ca36f7f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485ea993-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.245 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.246 187082 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap485ea993-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.247 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.248 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.251 187082 INFO os_vif [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:20:b6,bridge_name='br-int',has_traffic_filtering=True,id=485ea993-42ae-4c99-83e8-bd0e83107400,network=Network(54fc22e7-056f-4f9b-ad81-953ca36f7f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485ea993-42')
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.251 187082 INFO nova.virt.libvirt.driver [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Deleting instance files /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c_del
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.252 187082 INFO nova.virt.libvirt.driver [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Deletion of /var/lib/nova/instances/9e9ba893-015d-46a7-ba89-d40b181c6c9c_del complete
Nov 24 13:47:04 compute-1 systemd[1]: libpod-conmon-8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4.scope: Deactivated successfully.
Nov 24 13:47:04 compute-1 podman[219885]: 2025-11-24 13:47:04.303318988 +0000 UTC m=+0.034024931 container remove 8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.310 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[5498ba5f-76fc-47a1-9ef8-069b5a2a5ae1]: (4, ('Mon Nov 24 01:47:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10 (8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4)\n8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4\nMon Nov 24 01:47:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10 (8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4)\n8ce748265efda074a796375626848bf8f7de559307093d2be8a289abf1ae75b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.312 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[313f8ea9-0f53-4a99-93bf-a1b37ee6a468]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.313 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54fc22e7-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.315 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:04 compute-1 kernel: tap54fc22e7-00: left promiscuous mode
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.326 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.328 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[8bdaac06-2e1c-4851-9280-fdbbed356cde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.333 187082 INFO nova.compute.manager [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.333 187082 DEBUG oslo.service.loopingcall [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.334 187082 DEBUG nova.compute.manager [-] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.334 187082 DEBUG nova.network.neutron [-] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.345 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[1a24aa95-c15e-43d3-a417-91bd2f8ecde1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.346 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[663cd884-5631-4e28-8066-68f5ff9adc67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.373 208599 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ef1908-3632-43a5-8d8f-671deefa2429]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503577, 'reachable_time': 33857, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219900, 'error': None, 'target': 'ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.375 104336 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-54fc22e7-056f-4f9b-ad81-953ca36f7f10 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 13:47:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:04.375 104336 DEBUG oslo.privsep.daemon [-] privsep: reply[30df810f-58a1-47eb-8598-d0805f955515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 13:47:04 compute-1 systemd[1]: run-netns-ovnmeta\x2d54fc22e7\x2d056f\x2d4f9b\x2dad81\x2d953ca36f7f10.mount: Deactivated successfully.
Nov 24 13:47:04 compute-1 nova_compute[187078]: 2025-11-24 13:47:04.695 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:05 compute-1 podman[197429]: time="2025-11-24T13:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:47:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:47:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 24 13:47:05 compute-1 nova_compute[187078]: 2025-11-24 13:47:05.795 187082 DEBUG nova.compute.manager [req-db7770c7-478d-4b00-bd8e-9768bcf33cd9 req-46ad8170-a4b3-4c33-b896-e71273695528 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Received event network-vif-unplugged-485ea993-42ae-4c99-83e8-bd0e83107400 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:47:05 compute-1 nova_compute[187078]: 2025-11-24 13:47:05.795 187082 DEBUG oslo_concurrency.lockutils [req-db7770c7-478d-4b00-bd8e-9768bcf33cd9 req-46ad8170-a4b3-4c33-b896-e71273695528 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:05 compute-1 nova_compute[187078]: 2025-11-24 13:47:05.795 187082 DEBUG oslo_concurrency.lockutils [req-db7770c7-478d-4b00-bd8e-9768bcf33cd9 req-46ad8170-a4b3-4c33-b896-e71273695528 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:05 compute-1 nova_compute[187078]: 2025-11-24 13:47:05.795 187082 DEBUG oslo_concurrency.lockutils [req-db7770c7-478d-4b00-bd8e-9768bcf33cd9 req-46ad8170-a4b3-4c33-b896-e71273695528 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:05 compute-1 nova_compute[187078]: 2025-11-24 13:47:05.796 187082 DEBUG nova.compute.manager [req-db7770c7-478d-4b00-bd8e-9768bcf33cd9 req-46ad8170-a4b3-4c33-b896-e71273695528 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] No waiting events found dispatching network-vif-unplugged-485ea993-42ae-4c99-83e8-bd0e83107400 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:47:05 compute-1 nova_compute[187078]: 2025-11-24 13:47:05.796 187082 DEBUG nova.compute.manager [req-db7770c7-478d-4b00-bd8e-9768bcf33cd9 req-46ad8170-a4b3-4c33-b896-e71273695528 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Received event network-vif-unplugged-485ea993-42ae-4c99-83e8-bd0e83107400 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 13:47:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:06.004 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:47:06 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:06.005 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.005 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.055 187082 DEBUG nova.network.neutron [-] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.066 187082 INFO nova.compute.manager [-] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Took 1.73 seconds to deallocate network for instance.
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.107 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.107 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.113 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.144 187082 INFO nova.scheduler.client.report [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Deleted allocations for instance 9e9ba893-015d-46a7-ba89-d40b181c6c9c
Nov 24 13:47:06 compute-1 nova_compute[187078]: 2025-11-24 13:47:06.192 187082 DEBUG oslo_concurrency.lockutils [None req-d62ce8a2-e910-45fc-8a55-9ee30918e0f0 a8565a9de374413d99e6b1cd58da0895 e04aa76fa15645e08bc0a355328db96e - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.865 187082 DEBUG nova.compute.manager [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Received event network-vif-plugged-485ea993-42ae-4c99-83e8-bd0e83107400 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.865 187082 DEBUG oslo_concurrency.lockutils [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Acquiring lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.866 187082 DEBUG oslo_concurrency.lockutils [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.866 187082 DEBUG oslo_concurrency.lockutils [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] Lock "9e9ba893-015d-46a7-ba89-d40b181c6c9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.866 187082 DEBUG nova.compute.manager [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] No waiting events found dispatching network-vif-plugged-485ea993-42ae-4c99-83e8-bd0e83107400 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.866 187082 WARNING nova.compute.manager [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Received unexpected event network-vif-plugged-485ea993-42ae-4c99-83e8-bd0e83107400 for instance with vm_state deleted and task_state None.
Nov 24 13:47:07 compute-1 nova_compute[187078]: 2025-11-24 13:47:07.866 187082 DEBUG nova.compute.manager [req-d8b41e8a-2742-4ccc-83f2-2465cda5da2d req-9e767aef-12b3-46d7-aefa-c44dfed5e582 0871b051f90e4d9abc41a767716e94b4 e2988f19c61e464fa69547c5f841fd36 - - default default] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Received event network-vif-deleted-485ea993-42ae-4c99-83e8-bd0e83107400 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 13:47:09 compute-1 nova_compute[187078]: 2025-11-24 13:47:09.248 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:09 compute-1 podman[219902]: 2025-11-24 13:47:09.532604571 +0000 UTC m=+0.075165163 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:47:09 compute-1 podman[219901]: 2025-11-24 13:47:09.539613721 +0000 UTC m=+0.086176621 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:47:09 compute-1 nova_compute[187078]: 2025-11-24 13:47:09.697 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:10 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:47:10.007 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:47:14 compute-1 sshd-session[219946]: Invalid user admin1234 from 45.78.194.40 port 32808
Nov 24 13:47:14 compute-1 nova_compute[187078]: 2025-11-24 13:47:14.250 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:14 compute-1 sshd-session[219946]: Received disconnect from 45.78.194.40 port 32808:11: Bye Bye [preauth]
Nov 24 13:47:14 compute-1 sshd-session[219946]: Disconnected from invalid user admin1234 45.78.194.40 port 32808 [preauth]
Nov 24 13:47:14 compute-1 podman[219948]: 2025-11-24 13:47:14.549811711 +0000 UTC m=+0.091421301 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 24 13:47:14 compute-1 podman[219949]: 2025-11-24 13:47:14.626634609 +0000 UTC m=+0.154818937 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:47:14 compute-1 nova_compute[187078]: 2025-11-24 13:47:14.699 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:19 compute-1 nova_compute[187078]: 2025-11-24 13:47:19.226 187082 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763992024.2250009, 9e9ba893-015d-46a7-ba89-d40b181c6c9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 13:47:19 compute-1 nova_compute[187078]: 2025-11-24 13:47:19.226 187082 INFO nova.compute.manager [-] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] VM Stopped (Lifecycle Event)
Nov 24 13:47:19 compute-1 nova_compute[187078]: 2025-11-24 13:47:19.241 187082 DEBUG nova.compute.manager [None req-37e19621-9f43-49d5-b395-03b5e9ed4b8d - - - - - -] [instance: 9e9ba893-015d-46a7-ba89-d40b181c6c9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 13:47:19 compute-1 nova_compute[187078]: 2025-11-24 13:47:19.252 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: ERROR   13:47:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: ERROR   13:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: ERROR   13:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: ERROR   13:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: ERROR   13:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:47:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:47:19 compute-1 nova_compute[187078]: 2025-11-24 13:47:19.737 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:21 compute-1 sshd-session[219992]: Invalid user sol from 193.32.162.146 port 58024
Nov 24 13:47:22 compute-1 sshd-session[219992]: Connection closed by invalid user sol 193.32.162.146 port 58024 [preauth]
Nov 24 13:47:24 compute-1 nova_compute[187078]: 2025-11-24 13:47:24.253 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:24 compute-1 nova_compute[187078]: 2025-11-24 13:47:24.740 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:25 compute-1 nova_compute[187078]: 2025-11-24 13:47:25.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:26 compute-1 podman[219994]: 2025-11-24 13:47:26.497147296 +0000 UTC m=+0.048814050 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:47:28 compute-1 nova_compute[187078]: 2025-11-24 13:47:28.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:29 compute-1 nova_compute[187078]: 2025-11-24 13:47:29.255 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:29 compute-1 nova_compute[187078]: 2025-11-24 13:47:29.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:29 compute-1 nova_compute[187078]: 2025-11-24 13:47:29.741 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:30 compute-1 nova_compute[187078]: 2025-11-24 13:47:30.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:32 compute-1 nova_compute[187078]: 2025-11-24 13:47:32.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:34 compute-1 nova_compute[187078]: 2025-11-24 13:47:34.257 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:34 compute-1 nova_compute[187078]: 2025-11-24 13:47:34.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:34 compute-1 nova_compute[187078]: 2025-11-24 13:47:34.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:47:34 compute-1 nova_compute[187078]: 2025-11-24 13:47:34.743 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:35 compute-1 podman[197429]: time="2025-11-24T13:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:47:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:47:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.696 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.696 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.697 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.697 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.881 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.882 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5859MB free_disk=73.45538330078125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.882 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.882 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.951 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.952 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.978 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.994 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.996 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:47:35 compute-1 nova_compute[187078]: 2025-11-24 13:47:35.996 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:47:36 compute-1 ovn_controller[95368]: 2025-11-24T13:47:36Z|00250|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Nov 24 13:47:36 compute-1 nova_compute[187078]: 2025-11-24 13:47:36.996 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:36 compute-1 nova_compute[187078]: 2025-11-24 13:47:36.997 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:47:36 compute-1 nova_compute[187078]: 2025-11-24 13:47:36.997 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:47:37 compute-1 nova_compute[187078]: 2025-11-24 13:47:37.022 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:47:39 compute-1 nova_compute[187078]: 2025-11-24 13:47:39.287 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:39 compute-1 nova_compute[187078]: 2025-11-24 13:47:39.746 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:40 compute-1 podman[220018]: 2025-11-24 13:47:40.509568944 +0000 UTC m=+0.045454040 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:47:40 compute-1 podman[220017]: 2025-11-24 13:47:40.534722174 +0000 UTC m=+0.074737452 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:47:40 compute-1 nova_compute[187078]: 2025-11-24 13:47:40.686 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:47:42 compute-1 nova_compute[187078]: 2025-11-24 13:47:42.131 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:42 compute-1 sshd-session[220061]: Received disconnect from 175.100.24.139 port 57200:11: Bye Bye [preauth]
Nov 24 13:47:42 compute-1 sshd-session[220061]: Disconnected from authenticating user root 175.100.24.139 port 57200 [preauth]
Nov 24 13:47:44 compute-1 nova_compute[187078]: 2025-11-24 13:47:44.289 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:44 compute-1 nova_compute[187078]: 2025-11-24 13:47:44.749 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:45 compute-1 podman[220063]: 2025-11-24 13:47:45.525720737 +0000 UTC m=+0.077236740 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 13:47:45 compute-1 podman[220064]: 2025-11-24 13:47:45.574720091 +0000 UTC m=+0.112210035 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:47:49 compute-1 nova_compute[187078]: 2025-11-24 13:47:49.292 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: ERROR   13:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: ERROR   13:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: ERROR   13:47:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: ERROR   13:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: ERROR   13:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:47:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:47:49 compute-1 nova_compute[187078]: 2025-11-24 13:47:49.752 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:50 compute-1 sshd-session[220110]: Invalid user user from 45.148.10.240 port 35106
Nov 24 13:47:50 compute-1 sshd-session[220110]: Connection closed by invalid user user 45.148.10.240 port 35106 [preauth]
Nov 24 13:47:54 compute-1 nova_compute[187078]: 2025-11-24 13:47:54.294 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:54 compute-1 nova_compute[187078]: 2025-11-24 13:47:54.754 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:57 compute-1 podman[220112]: 2025-11-24 13:47:57.527340038 +0000 UTC m=+0.066331445 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, io.buildah.version=1.33.7)
Nov 24 13:47:59 compute-1 nova_compute[187078]: 2025-11-24 13:47:59.359 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:47:59 compute-1 nova_compute[187078]: 2025-11-24 13:47:59.755 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:01 compute-1 sshd-session[220133]: Invalid user git from 80.94.95.116 port 31388
Nov 24 13:48:01 compute-1 sshd-session[220133]: Connection closed by invalid user git 80.94.95.116 port 31388 [preauth]
Nov 24 13:48:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:48:04.176 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:48:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:48:04.177 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:48:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:48:04.177 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:48:04 compute-1 nova_compute[187078]: 2025-11-24 13:48:04.362 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:04 compute-1 nova_compute[187078]: 2025-11-24 13:48:04.758 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:05 compute-1 podman[197429]: time="2025-11-24T13:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:48:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:48:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 24 13:48:09 compute-1 nova_compute[187078]: 2025-11-24 13:48:09.364 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:09 compute-1 nova_compute[187078]: 2025-11-24 13:48:09.758 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:11 compute-1 podman[220137]: 2025-11-24 13:48:11.507907735 +0000 UTC m=+0.053778684 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:48:11 compute-1 podman[220138]: 2025-11-24 13:48:11.513589499 +0000 UTC m=+0.055555352 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:48:14 compute-1 nova_compute[187078]: 2025-11-24 13:48:14.366 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:14 compute-1 nova_compute[187078]: 2025-11-24 13:48:14.759 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:16 compute-1 ovn_controller[95368]: 2025-11-24T13:48:16Z|00251|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 24 13:48:16 compute-1 podman[220178]: 2025-11-24 13:48:16.513744398 +0000 UTC m=+0.060402535 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:48:16 compute-1 podman[220179]: 2025-11-24 13:48:16.539043352 +0000 UTC m=+0.081234608 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:48:17 compute-1 sshd-session[220135]: Received disconnect from 45.78.217.131 port 48060:11: Bye Bye [preauth]
Nov 24 13:48:17 compute-1 sshd-session[220135]: Disconnected from 45.78.217.131 port 48060 [preauth]
Nov 24 13:48:19 compute-1 nova_compute[187078]: 2025-11-24 13:48:19.368 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: ERROR   13:48:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: ERROR   13:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: ERROR   13:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: ERROR   13:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: ERROR   13:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:48:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:48:19 compute-1 nova_compute[187078]: 2025-11-24 13:48:19.795 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:24 compute-1 nova_compute[187078]: 2025-11-24 13:48:24.371 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:24 compute-1 nova_compute[187078]: 2025-11-24 13:48:24.796 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:27 compute-1 nova_compute[187078]: 2025-11-24 13:48:27.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:28 compute-1 podman[220224]: 2025-11-24 13:48:28.522744438 +0000 UTC m=+0.074903677 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Nov 24 13:48:29 compute-1 nova_compute[187078]: 2025-11-24 13:48:29.372 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:29 compute-1 nova_compute[187078]: 2025-11-24 13:48:29.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:29 compute-1 nova_compute[187078]: 2025-11-24 13:48:29.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:29 compute-1 nova_compute[187078]: 2025-11-24 13:48:29.855 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:31 compute-1 nova_compute[187078]: 2025-11-24 13:48:31.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:32 compute-1 nova_compute[187078]: 2025-11-24 13:48:32.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:34 compute-1 nova_compute[187078]: 2025-11-24 13:48:34.374 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:34 compute-1 nova_compute[187078]: 2025-11-24 13:48:34.856 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:35 compute-1 podman[197429]: time="2025-11-24T13:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:48:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:48:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 24 13:48:36 compute-1 nova_compute[187078]: 2025-11-24 13:48:36.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:36 compute-1 nova_compute[187078]: 2025-11-24 13:48:36.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.687 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.688 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.688 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.820 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.821 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.45540237426758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.822 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.822 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.884 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.884 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.918 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.931 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.932 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:48:37 compute-1 nova_compute[187078]: 2025-11-24 13:48:37.932 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:48:38 compute-1 nova_compute[187078]: 2025-11-24 13:48:38.932 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:38 compute-1 nova_compute[187078]: 2025-11-24 13:48:38.933 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:48:38 compute-1 nova_compute[187078]: 2025-11-24 13:48:38.933 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:48:38 compute-1 nova_compute[187078]: 2025-11-24 13:48:38.949 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:48:39 compute-1 nova_compute[187078]: 2025-11-24 13:48:39.376 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:39 compute-1 nova_compute[187078]: 2025-11-24 13:48:39.858 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:41 compute-1 nova_compute[187078]: 2025-11-24 13:48:41.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:42 compute-1 sshd-session[220248]: Accepted publickey for zuul from 192.168.122.10 port 54904 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:48:42 compute-1 systemd-logind[815]: New session 49 of user zuul.
Nov 24 13:48:42 compute-1 systemd[1]: Started Session 49 of User zuul.
Nov 24 13:48:42 compute-1 sshd-session[220248]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:48:42 compute-1 podman[220252]: 2025-11-24 13:48:42.249455515 +0000 UTC m=+0.043307671 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:48:42 compute-1 podman[220250]: 2025-11-24 13:48:42.250838343 +0000 UTC m=+0.051115664 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:48:42 compute-1 sudo[220295]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 24 13:48:42 compute-1 sudo[220295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:48:43 compute-1 nova_compute[187078]: 2025-11-24 13:48:43.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:48:44 compute-1 nova_compute[187078]: 2025-11-24 13:48:44.378 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:44 compute-1 nova_compute[187078]: 2025-11-24 13:48:44.860 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:46 compute-1 ovs-vsctl[220465]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 24 13:48:47 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220319 (sos)
Nov 24 13:48:47 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 24 13:48:47 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 24 13:48:47 compute-1 podman[220513]: 2025-11-24 13:48:47.397314757 +0000 UTC m=+0.069534071 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 24 13:48:47 compute-1 podman[220514]: 2025-11-24 13:48:47.451174813 +0000 UTC m=+0.122955055 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 13:48:47 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 24 13:48:47 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 24 13:48:47 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 13:48:48 compute-1 crontab[220926]: (root) LIST (root)
Nov 24 13:48:49 compute-1 nova_compute[187078]: 2025-11-24 13:48:49.380 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: ERROR   13:48:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: ERROR   13:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: ERROR   13:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: ERROR   13:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: ERROR   13:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:48:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:48:49 compute-1 nova_compute[187078]: 2025-11-24 13:48:49.863 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:51 compute-1 systemd[1]: Starting Hostname Service...
Nov 24 13:48:51 compute-1 systemd[1]: Started Hostname Service.
Nov 24 13:48:54 compute-1 nova_compute[187078]: 2025-11-24 13:48:54.382 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:54 compute-1 nova_compute[187078]: 2025-11-24 13:48:54.908 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:55 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 13:48:58 compute-1 ovs-appctl[222151]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 13:48:58 compute-1 ovs-appctl[222161]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 13:48:58 compute-1 ovs-appctl[222171]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 13:48:59 compute-1 nova_compute[187078]: 2025-11-24 13:48:59.385 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:48:59 compute-1 podman[222552]: 2025-11-24 13:48:59.532696274 +0000 UTC m=+0.074383692 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 13:48:59 compute-1 nova_compute[187078]: 2025-11-24 13:48:59.909 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:49:04.177 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:49:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:49:04.178 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:49:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:49:04.178 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:49:04 compute-1 nova_compute[187078]: 2025-11-24 13:49:04.391 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:04 compute-1 nova_compute[187078]: 2025-11-24 13:49:04.910 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:05 compute-1 podman[197429]: time="2025-11-24T13:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:49:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:49:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 24 13:49:06 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 13:49:08 compute-1 systemd[1]: Starting Time & Date Service...
Nov 24 13:49:08 compute-1 systemd[1]: Started Time & Date Service.
Nov 24 13:49:09 compute-1 nova_compute[187078]: 2025-11-24 13:49:09.393 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:09 compute-1 nova_compute[187078]: 2025-11-24 13:49:09.945 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:12 compute-1 podman[223604]: 2025-11-24 13:49:12.527679468 +0000 UTC m=+0.068574124 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:49:12 compute-1 podman[223605]: 2025-11-24 13:49:12.539655123 +0000 UTC m=+0.085824832 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:49:14 compute-1 nova_compute[187078]: 2025-11-24 13:49:14.396 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:14 compute-1 nova_compute[187078]: 2025-11-24 13:49:14.949 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:17 compute-1 podman[223646]: 2025-11-24 13:49:17.533739948 +0000 UTC m=+0.078101472 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd)
Nov 24 13:49:17 compute-1 podman[223667]: 2025-11-24 13:49:17.669605721 +0000 UTC m=+0.108924546 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:49:19 compute-1 nova_compute[187078]: 2025-11-24 13:49:19.399 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: ERROR   13:49:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: ERROR   13:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: ERROR   13:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: ERROR   13:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: ERROR   13:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:49:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:49:19 compute-1 nova_compute[187078]: 2025-11-24 13:49:19.950 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:24 compute-1 nova_compute[187078]: 2025-11-24 13:49:24.400 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:24 compute-1 sshd-session[223694]: Invalid user zjw from 175.100.24.139 port 59440
Nov 24 13:49:24 compute-1 sshd-session[223694]: Received disconnect from 175.100.24.139 port 59440:11: Bye Bye [preauth]
Nov 24 13:49:24 compute-1 sshd-session[223694]: Disconnected from invalid user zjw 175.100.24.139 port 59440 [preauth]
Nov 24 13:49:25 compute-1 nova_compute[187078]: 2025-11-24 13:49:25.000 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:27 compute-1 sudo[220295]: pam_unix(sudo:session): session closed for user root
Nov 24 13:49:27 compute-1 sshd-session[220267]: Received disconnect from 192.168.122.10 port 54904:11: disconnected by user
Nov 24 13:49:27 compute-1 sshd-session[220267]: Disconnected from user zuul 192.168.122.10 port 54904
Nov 24 13:49:27 compute-1 sshd-session[220248]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:49:27 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Nov 24 13:49:27 compute-1 systemd[1]: session-49.scope: Consumed 1min 18.580s CPU time, 483.1M memory peak, read 101.0M from disk, written 29.7M to disk.
Nov 24 13:49:27 compute-1 systemd-logind[815]: Session 49 logged out. Waiting for processes to exit.
Nov 24 13:49:27 compute-1 systemd-logind[815]: Removed session 49.
Nov 24 13:49:27 compute-1 sshd-session[223696]: Accepted publickey for zuul from 192.168.122.10 port 37436 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:49:27 compute-1 systemd-logind[815]: New session 50 of user zuul.
Nov 24 13:49:27 compute-1 systemd[1]: Started Session 50 of User zuul.
Nov 24 13:49:27 compute-1 sshd-session[223696]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:49:27 compute-1 sudo[223700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2025-11-24-fgchplo.tar.xz
Nov 24 13:49:28 compute-1 sudo[223700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:49:28 compute-1 sudo[223700]: pam_unix(sudo:session): session closed for user root
Nov 24 13:49:28 compute-1 sshd-session[223699]: Received disconnect from 192.168.122.10 port 37436:11: disconnected by user
Nov 24 13:49:28 compute-1 sshd-session[223699]: Disconnected from user zuul 192.168.122.10 port 37436
Nov 24 13:49:28 compute-1 sshd-session[223696]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:49:28 compute-1 systemd[1]: session-50.scope: Deactivated successfully.
Nov 24 13:49:28 compute-1 systemd-logind[815]: Session 50 logged out. Waiting for processes to exit.
Nov 24 13:49:28 compute-1 systemd-logind[815]: Removed session 50.
Nov 24 13:49:28 compute-1 nova_compute[187078]: 2025-11-24 13:49:28.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:28 compute-1 sshd-session[223725]: Accepted publickey for zuul from 192.168.122.10 port 37440 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:49:28 compute-1 systemd-logind[815]: New session 51 of user zuul.
Nov 24 13:49:28 compute-1 systemd[1]: Started Session 51 of User zuul.
Nov 24 13:49:28 compute-1 sshd-session[223725]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:49:29 compute-1 sudo[223729]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Nov 24 13:49:29 compute-1 sudo[223729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:49:29 compute-1 sudo[223729]: pam_unix(sudo:session): session closed for user root
Nov 24 13:49:29 compute-1 sshd-session[223728]: Received disconnect from 192.168.122.10 port 37440:11: disconnected by user
Nov 24 13:49:29 compute-1 sshd-session[223728]: Disconnected from user zuul 192.168.122.10 port 37440
Nov 24 13:49:29 compute-1 sshd-session[223725]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:49:29 compute-1 systemd-logind[815]: Session 51 logged out. Waiting for processes to exit.
Nov 24 13:49:29 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Nov 24 13:49:29 compute-1 systemd-logind[815]: Removed session 51.
Nov 24 13:49:29 compute-1 nova_compute[187078]: 2025-11-24 13:49:29.401 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:29 compute-1 nova_compute[187078]: 2025-11-24 13:49:29.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:30 compute-1 nova_compute[187078]: 2025-11-24 13:49:30.001 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:30 compute-1 podman[223754]: 2025-11-24 13:49:30.511697103 +0000 UTC m=+0.059582881 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 13:49:31 compute-1 nova_compute[187078]: 2025-11-24 13:49:31.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:32 compute-1 nova_compute[187078]: 2025-11-24 13:49:32.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:32 compute-1 nova_compute[187078]: 2025-11-24 13:49:32.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:34 compute-1 nova_compute[187078]: 2025-11-24 13:49:34.403 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:35 compute-1 nova_compute[187078]: 2025-11-24 13:49:35.005 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:35 compute-1 podman[197429]: time="2025-11-24T13:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:49:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:49:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 24 13:49:36 compute-1 nova_compute[187078]: 2025-11-24 13:49:36.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:36 compute-1 nova_compute[187078]: 2025-11-24 13:49:36.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:49:38 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 13:49:38 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.405 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.694 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.694 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.874 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.876 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5784MB free_disk=73.45523452758789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.876 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.876 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.928 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.928 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.942 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.957 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.957 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.968 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:49:39 compute-1 nova_compute[187078]: 2025-11-24 13:49:39.987 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:49:40 compute-1 nova_compute[187078]: 2025-11-24 13:49:40.008 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:49:40 compute-1 nova_compute[187078]: 2025-11-24 13:49:40.009 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:40 compute-1 nova_compute[187078]: 2025-11-24 13:49:40.024 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:49:40 compute-1 nova_compute[187078]: 2025-11-24 13:49:40.025 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:49:40 compute-1 nova_compute[187078]: 2025-11-24 13:49:40.026 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:49:41 compute-1 nova_compute[187078]: 2025-11-24 13:49:41.027 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:41 compute-1 nova_compute[187078]: 2025-11-24 13:49:41.027 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:49:41 compute-1 nova_compute[187078]: 2025-11-24 13:49:41.027 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:49:41 compute-1 nova_compute[187078]: 2025-11-24 13:49:41.044 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:49:43 compute-1 podman[223779]: 2025-11-24 13:49:43.520905773 +0000 UTC m=+0.062068929 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:49:43 compute-1 podman[223780]: 2025-11-24 13:49:43.520950304 +0000 UTC m=+0.061563215 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:49:43 compute-1 nova_compute[187078]: 2025-11-24 13:49:43.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:44 compute-1 nova_compute[187078]: 2025-11-24 13:49:44.407 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:45 compute-1 nova_compute[187078]: 2025-11-24 13:49:45.009 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:48 compute-1 podman[223822]: 2025-11-24 13:49:48.528214005 +0000 UTC m=+0.071811951 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 13:49:48 compute-1 podman[223823]: 2025-11-24 13:49:48.55128205 +0000 UTC m=+0.089984774 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 13:49:49 compute-1 nova_compute[187078]: 2025-11-24 13:49:49.409 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: ERROR   13:49:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: ERROR   13:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: ERROR   13:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: ERROR   13:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: ERROR   13:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:49:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:49:50 compute-1 nova_compute[187078]: 2025-11-24 13:49:50.012 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:54 compute-1 nova_compute[187078]: 2025-11-24 13:49:54.410 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:55 compute-1 nova_compute[187078]: 2025-11-24 13:49:55.051 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:49:57 compute-1 sshd-session[223868]: Invalid user solv from 45.148.10.240 port 48190
Nov 24 13:49:57 compute-1 sshd-session[223868]: Connection closed by invalid user solv 45.148.10.240 port 48190 [preauth]
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.667 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.668 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.668 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.668 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.668 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.669 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.687 187082 DEBUG nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.687 187082 WARNING nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.687 187082 INFO nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Removable base files: /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.688 187082 INFO nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/4a622edf34d6c396497a8622355dd999c6ac487f
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.688 187082 DEBUG nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.688 187082 DEBUG nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 24 13:49:57 compute-1 nova_compute[187078]: 2025-11-24 13:49:57.688 187082 DEBUG nova.virt.libvirt.imagecache [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 24 13:49:59 compute-1 nova_compute[187078]: 2025-11-24 13:49:59.411 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:00 compute-1 nova_compute[187078]: 2025-11-24 13:50:00.055 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:01 compute-1 anacron[94740]: Job `cron.weekly' started
Nov 24 13:50:01 compute-1 anacron[94740]: Job `cron.weekly' terminated
Nov 24 13:50:01 compute-1 podman[223872]: 2025-11-24 13:50:01.568253238 +0000 UTC m=+0.089034038 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:50:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:50:04.179 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:50:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:50:04.181 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:50:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:50:04.181 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:50:04 compute-1 nova_compute[187078]: 2025-11-24 13:50:04.414 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:05 compute-1 nova_compute[187078]: 2025-11-24 13:50:05.055 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:05 compute-1 podman[197429]: time="2025-11-24T13:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:50:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:50:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 24 13:50:09 compute-1 nova_compute[187078]: 2025-11-24 13:50:09.415 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:10 compute-1 nova_compute[187078]: 2025-11-24 13:50:10.057 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:14 compute-1 nova_compute[187078]: 2025-11-24 13:50:14.417 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:14 compute-1 podman[223895]: 2025-11-24 13:50:14.51467993 +0000 UTC m=+0.058948695 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 13:50:14 compute-1 podman[223894]: 2025-11-24 13:50:14.524232377 +0000 UTC m=+0.070662231 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:50:15 compute-1 nova_compute[187078]: 2025-11-24 13:50:15.059 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: ERROR   13:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: ERROR   13:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: ERROR   13:50:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: ERROR   13:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: ERROR   13:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:50:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:50:19 compute-1 nova_compute[187078]: 2025-11-24 13:50:19.420 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:19 compute-1 podman[223936]: 2025-11-24 13:50:19.511122259 +0000 UTC m=+0.056803746 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 13:50:19 compute-1 podman[223937]: 2025-11-24 13:50:19.535514279 +0000 UTC m=+0.076790816 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 24 13:50:20 compute-1 nova_compute[187078]: 2025-11-24 13:50:20.091 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:24 compute-1 nova_compute[187078]: 2025-11-24 13:50:24.421 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:25 compute-1 nova_compute[187078]: 2025-11-24 13:50:25.093 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:29 compute-1 nova_compute[187078]: 2025-11-24 13:50:29.423 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:29 compute-1 nova_compute[187078]: 2025-11-24 13:50:29.688 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:30 compute-1 nova_compute[187078]: 2025-11-24 13:50:30.095 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:30 compute-1 nova_compute[187078]: 2025-11-24 13:50:30.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:30 compute-1 nova_compute[187078]: 2025-11-24 13:50:30.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:32 compute-1 podman[223984]: 2025-11-24 13:50:32.513155714 +0000 UTC m=+0.060228429 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Nov 24 13:50:32 compute-1 sshd-session[223982]: Invalid user sol from 193.32.162.146 port 41532
Nov 24 13:50:32 compute-1 sshd-session[223982]: Connection closed by invalid user sol 193.32.162.146 port 41532 [preauth]
Nov 24 13:50:33 compute-1 nova_compute[187078]: 2025-11-24 13:50:33.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:33 compute-1 nova_compute[187078]: 2025-11-24 13:50:33.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:33 compute-1 nova_compute[187078]: 2025-11-24 13:50:33.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:34 compute-1 nova_compute[187078]: 2025-11-24 13:50:34.463 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:35 compute-1 nova_compute[187078]: 2025-11-24 13:50:35.097 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:35 compute-1 podman[197429]: time="2025-11-24T13:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:50:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:50:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Nov 24 13:50:36 compute-1 nova_compute[187078]: 2025-11-24 13:50:36.427 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:36 compute-1 nova_compute[187078]: 2025-11-24 13:50:36.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:36 compute-1 nova_compute[187078]: 2025-11-24 13:50:36.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:50:38 compute-1 sshd-session[224005]: Received disconnect from 45.78.217.131 port 40232:11: Bye Bye [preauth]
Nov 24 13:50:38 compute-1 sshd-session[224005]: Disconnected from authenticating user root 45.78.217.131 port 40232 [preauth]
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.465 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.692 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.692 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.829 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.829 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5862MB free_disk=73.45519638061523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.830 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.830 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.950 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.950 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.976 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.990 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.991 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:50:39 compute-1 nova_compute[187078]: 2025-11-24 13:50:39.992 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:50:40 compute-1 nova_compute[187078]: 2025-11-24 13:50:40.099 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:40 compute-1 nova_compute[187078]: 2025-11-24 13:50:40.992 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:40 compute-1 nova_compute[187078]: 2025-11-24 13:50:40.993 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:50:40 compute-1 nova_compute[187078]: 2025-11-24 13:50:40.993 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:50:41 compute-1 nova_compute[187078]: 2025-11-24 13:50:41.007 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:50:44 compute-1 nova_compute[187078]: 2025-11-24 13:50:44.466 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:45 compute-1 nova_compute[187078]: 2025-11-24 13:50:45.101 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:45 compute-1 podman[224008]: 2025-11-24 13:50:45.526723893 +0000 UTC m=+0.051311568 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 13:50:45 compute-1 podman[224007]: 2025-11-24 13:50:45.540311411 +0000 UTC m=+0.065153612 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:50:45 compute-1 nova_compute[187078]: 2025-11-24 13:50:45.675 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:48 compute-1 nova_compute[187078]: 2025-11-24 13:50:48.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: ERROR   13:50:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: ERROR   13:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: ERROR   13:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: ERROR   13:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: ERROR   13:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:50:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:50:49 compute-1 nova_compute[187078]: 2025-11-24 13:50:49.469 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:50 compute-1 nova_compute[187078]: 2025-11-24 13:50:50.103 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:50 compute-1 podman[224048]: 2025-11-24 13:50:50.512936676 +0000 UTC m=+0.059715865 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 24 13:50:50 compute-1 podman[224049]: 2025-11-24 13:50:50.579552777 +0000 UTC m=+0.119162173 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 24 13:50:54 compute-1 nova_compute[187078]: 2025-11-24 13:50:54.470 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:54 compute-1 nova_compute[187078]: 2025-11-24 13:50:54.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:54 compute-1 nova_compute[187078]: 2025-11-24 13:50:54.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:50:54 compute-1 nova_compute[187078]: 2025-11-24 13:50:54.677 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:50:55 compute-1 nova_compute[187078]: 2025-11-24 13:50:55.104 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:59 compute-1 nova_compute[187078]: 2025-11-24 13:50:59.472 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:50:59 compute-1 nova_compute[187078]: 2025-11-24 13:50:59.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:50:59 compute-1 nova_compute[187078]: 2025-11-24 13:50:59.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:51:00 compute-1 nova_compute[187078]: 2025-11-24 13:51:00.106 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:03 compute-1 podman[224093]: 2025-11-24 13:51:03.496622666 +0000 UTC m=+0.049197612 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, distribution-scope=public)
Nov 24 13:51:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:51:04.179 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:51:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:51:04.180 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:51:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:51:04.180 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:51:04 compute-1 nova_compute[187078]: 2025-11-24 13:51:04.474 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:05 compute-1 nova_compute[187078]: 2025-11-24 13:51:05.107 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:05 compute-1 podman[197429]: time="2025-11-24T13:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:51:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:51:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Nov 24 13:51:06 compute-1 sshd-session[224114]: Invalid user saas from 175.100.24.139 port 33392
Nov 24 13:51:06 compute-1 sshd-session[224114]: Received disconnect from 175.100.24.139 port 33392:11: Bye Bye [preauth]
Nov 24 13:51:06 compute-1 sshd-session[224114]: Disconnected from invalid user saas 175.100.24.139 port 33392 [preauth]
Nov 24 13:51:09 compute-1 nova_compute[187078]: 2025-11-24 13:51:09.477 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:10 compute-1 nova_compute[187078]: 2025-11-24 13:51:10.108 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:14 compute-1 nova_compute[187078]: 2025-11-24 13:51:14.478 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:15 compute-1 nova_compute[187078]: 2025-11-24 13:51:15.111 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:16 compute-1 podman[224116]: 2025-11-24 13:51:16.544006026 +0000 UTC m=+0.080194639 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:51:16 compute-1 podman[224117]: 2025-11-24 13:51:16.54895273 +0000 UTC m=+0.080553059 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: ERROR   13:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: ERROR   13:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: ERROR   13:51:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: ERROR   13:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: ERROR   13:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:51:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:51:19 compute-1 nova_compute[187078]: 2025-11-24 13:51:19.482 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:20 compute-1 nova_compute[187078]: 2025-11-24 13:51:20.120 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:21 compute-1 podman[224159]: 2025-11-24 13:51:21.521627556 +0000 UTC m=+0.069170371 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 24 13:51:21 compute-1 podman[224160]: 2025-11-24 13:51:21.63462074 +0000 UTC m=+0.166705127 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller)
Nov 24 13:51:24 compute-1 nova_compute[187078]: 2025-11-24 13:51:24.487 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:25 compute-1 nova_compute[187078]: 2025-11-24 13:51:25.173 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:29 compute-1 nova_compute[187078]: 2025-11-24 13:51:29.489 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:30 compute-1 nova_compute[187078]: 2025-11-24 13:51:30.175 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:31 compute-1 nova_compute[187078]: 2025-11-24 13:51:31.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:31 compute-1 nova_compute[187078]: 2025-11-24 13:51:31.677 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:33 compute-1 nova_compute[187078]: 2025-11-24 13:51:33.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:34 compute-1 nova_compute[187078]: 2025-11-24 13:51:34.491 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:34 compute-1 podman[224205]: 2025-11-24 13:51:34.540628415 +0000 UTC m=+0.092570226 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41)
Nov 24 13:51:35 compute-1 nova_compute[187078]: 2025-11-24 13:51:35.177 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:35 compute-1 podman[197429]: time="2025-11-24T13:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:51:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:51:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:51:35 compute-1 nova_compute[187078]: 2025-11-24 13:51:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:35 compute-1 nova_compute[187078]: 2025-11-24 13:51:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:36 compute-1 nova_compute[187078]: 2025-11-24 13:51:36.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:36 compute-1 nova_compute[187078]: 2025-11-24 13:51:36.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.492 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.695 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.695 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.695 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.695 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.822 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.823 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5861MB free_disk=73.4551773071289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.824 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.824 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.921 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.922 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.948 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.960 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.961 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:51:39 compute-1 nova_compute[187078]: 2025-11-24 13:51:39.962 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:51:40 compute-1 nova_compute[187078]: 2025-11-24 13:51:40.180 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:41 compute-1 nova_compute[187078]: 2025-11-24 13:51:41.962 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:41 compute-1 nova_compute[187078]: 2025-11-24 13:51:41.963 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:51:41 compute-1 nova_compute[187078]: 2025-11-24 13:51:41.963 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:51:41 compute-1 nova_compute[187078]: 2025-11-24 13:51:41.973 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:51:44 compute-1 nova_compute[187078]: 2025-11-24 13:51:44.495 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:45 compute-1 nova_compute[187078]: 2025-11-24 13:51:45.181 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:47 compute-1 podman[224227]: 2025-11-24 13:51:47.509912882 +0000 UTC m=+0.056463759 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 13:51:47 compute-1 podman[224228]: 2025-11-24 13:51:47.511229947 +0000 UTC m=+0.054420793 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 13:51:47 compute-1 nova_compute[187078]: 2025-11-24 13:51:47.670 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: ERROR   13:51:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: ERROR   13:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: ERROR   13:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: ERROR   13:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: ERROR   13:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:51:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:51:49 compute-1 nova_compute[187078]: 2025-11-24 13:51:49.496 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:50 compute-1 nova_compute[187078]: 2025-11-24 13:51:50.186 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:52 compute-1 podman[224271]: 2025-11-24 13:51:52.531954019 +0000 UTC m=+0.073584193 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 24 13:51:52 compute-1 podman[224272]: 2025-11-24 13:51:52.612391305 +0000 UTC m=+0.139042223 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:51:54 compute-1 nova_compute[187078]: 2025-11-24 13:51:54.500 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:55 compute-1 nova_compute[187078]: 2025-11-24 13:51:55.184 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:51:59 compute-1 nova_compute[187078]: 2025-11-24 13:51:59.503 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:00 compute-1 nova_compute[187078]: 2025-11-24 13:52:00.186 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:52:04.181 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:52:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:52:04.182 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:52:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:52:04.182 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:52:04 compute-1 nova_compute[187078]: 2025-11-24 13:52:04.507 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:05 compute-1 nova_compute[187078]: 2025-11-24 13:52:05.187 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:05 compute-1 podman[224317]: 2025-11-24 13:52:05.524076773 +0000 UTC m=+0.063019705 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Nov 24 13:52:05 compute-1 podman[197429]: time="2025-11-24T13:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:52:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:52:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 24 13:52:09 compute-1 nova_compute[187078]: 2025-11-24 13:52:09.509 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:09 compute-1 sshd-session[224338]: Invalid user solv from 45.148.10.240 port 51632
Nov 24 13:52:09 compute-1 sshd-session[224338]: Connection closed by invalid user solv 45.148.10.240 port 51632 [preauth]
Nov 24 13:52:10 compute-1 nova_compute[187078]: 2025-11-24 13:52:10.252 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:14 compute-1 nova_compute[187078]: 2025-11-24 13:52:14.513 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:15 compute-1 nova_compute[187078]: 2025-11-24 13:52:15.282 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:18 compute-1 podman[224340]: 2025-11-24 13:52:18.507807072 +0000 UTC m=+0.053743925 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:52:18 compute-1 podman[224341]: 2025-11-24 13:52:18.524935435 +0000 UTC m=+0.060223940 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: ERROR   13:52:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: ERROR   13:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: ERROR   13:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: ERROR   13:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: ERROR   13:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:52:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:52:19 compute-1 nova_compute[187078]: 2025-11-24 13:52:19.516 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:20 compute-1 nova_compute[187078]: 2025-11-24 13:52:20.316 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:23 compute-1 podman[224386]: 2025-11-24 13:52:23.564809005 +0000 UTC m=+0.095869796 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 13:52:23 compute-1 podman[224387]: 2025-11-24 13:52:23.605855126 +0000 UTC m=+0.132581229 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:52:24 compute-1 nova_compute[187078]: 2025-11-24 13:52:24.517 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:25 compute-1 nova_compute[187078]: 2025-11-24 13:52:25.369 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:29 compute-1 nova_compute[187078]: 2025-11-24 13:52:29.521 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:30 compute-1 nova_compute[187078]: 2025-11-24 13:52:30.416 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:30 compute-1 sshd-session[224384]: Invalid user deploy from 45.78.194.40 port 38178
Nov 24 13:52:31 compute-1 sshd-session[224384]: Received disconnect from 45.78.194.40 port 38178:11: Bye Bye [preauth]
Nov 24 13:52:31 compute-1 sshd-session[224384]: Disconnected from invalid user deploy 45.78.194.40 port 38178 [preauth]
Nov 24 13:52:32 compute-1 nova_compute[187078]: 2025-11-24 13:52:32.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:33 compute-1 nova_compute[187078]: 2025-11-24 13:52:33.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:34 compute-1 nova_compute[187078]: 2025-11-24 13:52:34.523 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:34 compute-1 nova_compute[187078]: 2025-11-24 13:52:34.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:35 compute-1 nova_compute[187078]: 2025-11-24 13:52:35.456 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:35 compute-1 podman[197429]: time="2025-11-24T13:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:52:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:52:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 24 13:52:36 compute-1 podman[224432]: 2025-11-24 13:52:36.521361777 +0000 UTC m=+0.069099100 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:52:36 compute-1 nova_compute[187078]: 2025-11-24 13:52:36.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:37 compute-1 nova_compute[187078]: 2025-11-24 13:52:37.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:38 compute-1 nova_compute[187078]: 2025-11-24 13:52:38.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:38 compute-1 nova_compute[187078]: 2025-11-24 13:52:38.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:52:39 compute-1 nova_compute[187078]: 2025-11-24 13:52:39.527 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.509 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.685 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.835 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.836 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5855MB free_disk=73.4552993774414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.837 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.837 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.883 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.884 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.904 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.925 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.926 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:52:40 compute-1 nova_compute[187078]: 2025-11-24 13:52:40.926 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:52:41 compute-1 nova_compute[187078]: 2025-11-24 13:52:41.927 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:41 compute-1 nova_compute[187078]: 2025-11-24 13:52:41.928 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:52:41 compute-1 nova_compute[187078]: 2025-11-24 13:52:41.928 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:52:41 compute-1 nova_compute[187078]: 2025-11-24 13:52:41.943 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:52:44 compute-1 nova_compute[187078]: 2025-11-24 13:52:44.531 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:45 compute-1 nova_compute[187078]: 2025-11-24 13:52:45.563 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:46 compute-1 sshd-session[224454]: Received disconnect from 193.46.255.244 port 47832:11:  [preauth]
Nov 24 13:52:46 compute-1 sshd-session[224454]: Disconnected from authenticating user root 193.46.255.244 port 47832 [preauth]
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: ERROR   13:52:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: ERROR   13:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: ERROR   13:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: ERROR   13:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: ERROR   13:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:52:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:52:49 compute-1 podman[224456]: 2025-11-24 13:52:49.514610652 +0000 UTC m=+0.056958182 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:52:49 compute-1 podman[224457]: 2025-11-24 13:52:49.519766141 +0000 UTC m=+0.056890190 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:52:49 compute-1 nova_compute[187078]: 2025-11-24 13:52:49.532 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:49 compute-1 nova_compute[187078]: 2025-11-24 13:52:49.676 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:50 compute-1 nova_compute[187078]: 2025-11-24 13:52:50.565 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:53 compute-1 nova_compute[187078]: 2025-11-24 13:52:53.261 187082 DEBUG oslo_concurrency.processutils [None req-9a6dcc31-e28a-4373-9b95-6b77d19bfc05 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 13:52:53 compute-1 nova_compute[187078]: 2025-11-24 13:52:53.292 187082 DEBUG oslo_concurrency.processutils [None req-9a6dcc31-e28a-4373-9b95-6b77d19bfc05 f238bc60f1a24758958c8e5ab5e900ac 6c72cc3d41144d349210cde7f8024bbf - - default default] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 13:52:53 compute-1 nova_compute[187078]: 2025-11-24 13:52:53.662 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:52:54 compute-1 podman[224498]: 2025-11-24 13:52:54.517635394 +0000 UTC m=+0.058154964 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:52:54 compute-1 nova_compute[187078]: 2025-11-24 13:52:54.536 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:54 compute-1 podman[224499]: 2025-11-24 13:52:54.559558759 +0000 UTC m=+0.089749710 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 13:52:55 compute-1 nova_compute[187078]: 2025-11-24 13:52:55.596 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:52:59.137 104225 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:6c:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:c7:7b:9d:b6:1b'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 13:52:59 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:52:59.138 104225 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 13:52:59 compute-1 nova_compute[187078]: 2025-11-24 13:52:59.181 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:52:59 compute-1 nova_compute[187078]: 2025-11-24 13:52:59.538 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:00 compute-1 nova_compute[187078]: 2025-11-24 13:53:00.597 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:53:04.182 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:53:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:53:04.182 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:53:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:53:04.183 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:53:04 compute-1 nova_compute[187078]: 2025-11-24 13:53:04.542 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:05 compute-1 nova_compute[187078]: 2025-11-24 13:53:05.632 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:05 compute-1 podman[197429]: time="2025-11-24T13:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:53:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:53:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 24 13:53:07 compute-1 podman[224542]: 2025-11-24 13:53:07.545900268 +0000 UTC m=+0.079908764 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 13:53:08 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:53:08.140 104225 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=971456df-f9ba-4c8a-bc15-c9feb573d541, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 13:53:09 compute-1 nova_compute[187078]: 2025-11-24 13:53:09.544 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:10 compute-1 nova_compute[187078]: 2025-11-24 13:53:10.634 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:14 compute-1 nova_compute[187078]: 2025-11-24 13:53:14.548 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:15 compute-1 nova_compute[187078]: 2025-11-24 13:53:15.686 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: ERROR   13:53:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: ERROR   13:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: ERROR   13:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: ERROR   13:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: ERROR   13:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:53:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:53:19 compute-1 nova_compute[187078]: 2025-11-24 13:53:19.549 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:20 compute-1 podman[224563]: 2025-11-24 13:53:20.535906256 +0000 UTC m=+0.070685784 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 13:53:20 compute-1 podman[224564]: 2025-11-24 13:53:20.538972968 +0000 UTC m=+0.066798298 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 13:53:20 compute-1 nova_compute[187078]: 2025-11-24 13:53:20.688 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:24 compute-1 nova_compute[187078]: 2025-11-24 13:53:24.553 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:25 compute-1 podman[224608]: 2025-11-24 13:53:25.534766066 +0000 UTC m=+0.067881617 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:53:25 compute-1 podman[224609]: 2025-11-24 13:53:25.604859753 +0000 UTC m=+0.126422812 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 24 13:53:25 compute-1 nova_compute[187078]: 2025-11-24 13:53:25.688 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:29 compute-1 nova_compute[187078]: 2025-11-24 13:53:29.555 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:30 compute-1 nova_compute[187078]: 2025-11-24 13:53:30.734 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:34 compute-1 nova_compute[187078]: 2025-11-24 13:53:34.558 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:34 compute-1 nova_compute[187078]: 2025-11-24 13:53:34.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:35 compute-1 podman[197429]: time="2025-11-24T13:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:53:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:53:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:53:35 compute-1 nova_compute[187078]: 2025-11-24 13:53:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:35 compute-1 nova_compute[187078]: 2025-11-24 13:53:35.735 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:36 compute-1 nova_compute[187078]: 2025-11-24 13:53:36.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:38 compute-1 podman[224656]: 2025-11-24 13:53:38.511279879 +0000 UTC m=+0.059782529 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public)
Nov 24 13:53:38 compute-1 nova_compute[187078]: 2025-11-24 13:53:38.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:39 compute-1 nova_compute[187078]: 2025-11-24 13:53:39.562 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:39 compute-1 nova_compute[187078]: 2025-11-24 13:53:39.665 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:39 compute-1 nova_compute[187078]: 2025-11-24 13:53:39.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:39 compute-1 nova_compute[187078]: 2025-11-24 13:53:39.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:53:40 compute-1 nova_compute[187078]: 2025-11-24 13:53:40.777 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.686 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.840 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.841 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5865MB free_disk=73.4552993774414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.842 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.842 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.911 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.912 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.981 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.993 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.995 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:53:41 compute-1 nova_compute[187078]: 2025-11-24 13:53:41.995 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:53:43 compute-1 nova_compute[187078]: 2025-11-24 13:53:43.994 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:43 compute-1 nova_compute[187078]: 2025-11-24 13:53:43.995 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:53:43 compute-1 nova_compute[187078]: 2025-11-24 13:53:43.995 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:53:44 compute-1 nova_compute[187078]: 2025-11-24 13:53:44.012 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:53:44 compute-1 nova_compute[187078]: 2025-11-24 13:53:44.566 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:45 compute-1 nova_compute[187078]: 2025-11-24 13:53:45.780 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:47 compute-1 sshd-session[224677]: Invalid user sol from 193.32.162.146 port 53298
Nov 24 13:53:47 compute-1 sshd-session[224677]: Connection closed by invalid user sol 193.32.162.146 port 53298 [preauth]
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: ERROR   13:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: ERROR   13:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: ERROR   13:53:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: ERROR   13:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: ERROR   13:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:53:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:53:49 compute-1 nova_compute[187078]: 2025-11-24 13:53:49.567 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:50 compute-1 nova_compute[187078]: 2025-11-24 13:53:50.677 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:53:50 compute-1 nova_compute[187078]: 2025-11-24 13:53:50.782 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:51 compute-1 podman[224679]: 2025-11-24 13:53:51.501916097 +0000 UTC m=+0.053077308 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:53:51 compute-1 podman[224680]: 2025-11-24 13:53:51.506161341 +0000 UTC m=+0.053936180 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:53:54 compute-1 nova_compute[187078]: 2025-11-24 13:53:54.570 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:55 compute-1 nova_compute[187078]: 2025-11-24 13:53:55.783 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:53:56 compute-1 podman[224721]: 2025-11-24 13:53:56.505649806 +0000 UTC m=+0.053125358 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:53:56 compute-1 podman[224722]: 2025-11-24 13:53:56.532633237 +0000 UTC m=+0.076156203 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 13:53:59 compute-1 nova_compute[187078]: 2025-11-24 13:53:59.601 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:00 compute-1 nova_compute[187078]: 2025-11-24 13:54:00.826 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:54:04.184 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:54:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:54:04.184 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:54:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:54:04.185 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:54:04 compute-1 nova_compute[187078]: 2025-11-24 13:54:04.605 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:05 compute-1 podman[197429]: time="2025-11-24T13:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:54:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:54:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Nov 24 13:54:05 compute-1 nova_compute[187078]: 2025-11-24 13:54:05.829 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:09 compute-1 podman[224767]: 2025-11-24 13:54:09.495194671 +0000 UTC m=+0.047330512 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 13:54:09 compute-1 nova_compute[187078]: 2025-11-24 13:54:09.606 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:10 compute-1 nova_compute[187078]: 2025-11-24 13:54:10.831 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:14 compute-1 sshd-session[224789]: Invalid user solv from 45.148.10.240 port 36010
Nov 24 13:54:14 compute-1 sshd-session[224789]: Connection closed by invalid user solv 45.148.10.240 port 36010 [preauth]
Nov 24 13:54:14 compute-1 nova_compute[187078]: 2025-11-24 13:54:14.610 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:15 compute-1 nova_compute[187078]: 2025-11-24 13:54:15.833 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: ERROR   13:54:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: ERROR   13:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: ERROR   13:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: ERROR   13:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: ERROR   13:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:54:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:54:19 compute-1 nova_compute[187078]: 2025-11-24 13:54:19.611 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:20 compute-1 nova_compute[187078]: 2025-11-24 13:54:20.884 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:22 compute-1 podman[224791]: 2025-11-24 13:54:22.502517046 +0000 UTC m=+0.048450862 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:54:22 compute-1 podman[224792]: 2025-11-24 13:54:22.532739654 +0000 UTC m=+0.074598090 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 13:54:24 compute-1 nova_compute[187078]: 2025-11-24 13:54:24.614 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:25 compute-1 nova_compute[187078]: 2025-11-24 13:54:25.887 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:27 compute-1 podman[224833]: 2025-11-24 13:54:27.514595315 +0000 UTC m=+0.063887500 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:54:27 compute-1 podman[224834]: 2025-11-24 13:54:27.536742374 +0000 UTC m=+0.085802213 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 13:54:29 compute-1 nova_compute[187078]: 2025-11-24 13:54:29.627 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:30 compute-1 nova_compute[187078]: 2025-11-24 13:54:30.926 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:34 compute-1 nova_compute[187078]: 2025-11-24 13:54:34.631 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:35 compute-1 podman[197429]: time="2025-11-24T13:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:54:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:54:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 24 13:54:35 compute-1 nova_compute[187078]: 2025-11-24 13:54:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:35 compute-1 nova_compute[187078]: 2025-11-24 13:54:35.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:35 compute-1 nova_compute[187078]: 2025-11-24 13:54:35.929 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:36 compute-1 nova_compute[187078]: 2025-11-24 13:54:36.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:38 compute-1 nova_compute[187078]: 2025-11-24 13:54:38.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:39 compute-1 nova_compute[187078]: 2025-11-24 13:54:39.634 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:40 compute-1 podman[224878]: 2025-11-24 13:54:40.527772558 +0000 UTC m=+0.069265064 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 24 13:54:40 compute-1 nova_compute[187078]: 2025-11-24 13:54:40.933 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.669 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.669 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.691 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.692 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.902 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.904 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5859MB free_disk=73.4552993774414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.904 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.905 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.963 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.964 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:54:41 compute-1 nova_compute[187078]: 2025-11-24 13:54:41.989 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing inventories for resource provider ece8f004-1d5b-407f-a713-f9e87706b045 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.010 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating ProviderTree inventory for provider ece8f004-1d5b-407f-a713-f9e87706b045 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.011 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Updating inventory in ProviderTree for provider ece8f004-1d5b-407f-a713-f9e87706b045 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.030 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing aggregate associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.061 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Refreshing trait associations for resource provider ece8f004-1d5b-407f-a713-f9e87706b045, traits: HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.089 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.109 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.112 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:54:42 compute-1 nova_compute[187078]: 2025-11-24 13:54:42.113 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:54:44 compute-1 nova_compute[187078]: 2025-11-24 13:54:44.113 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:44 compute-1 nova_compute[187078]: 2025-11-24 13:54:44.114 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:54:44 compute-1 nova_compute[187078]: 2025-11-24 13:54:44.115 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:54:44 compute-1 nova_compute[187078]: 2025-11-24 13:54:44.131 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:54:44 compute-1 nova_compute[187078]: 2025-11-24 13:54:44.638 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:45 compute-1 nova_compute[187078]: 2025-11-24 13:54:45.935 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: ERROR   13:54:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: ERROR   13:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: ERROR   13:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: ERROR   13:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: ERROR   13:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:54:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:54:49 compute-1 nova_compute[187078]: 2025-11-24 13:54:49.640 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:50 compute-1 nova_compute[187078]: 2025-11-24 13:54:50.678 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:50 compute-1 nova_compute[187078]: 2025-11-24 13:54:50.972 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:53 compute-1 podman[224902]: 2025-11-24 13:54:53.516913793 +0000 UTC m=+0.059320096 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:54:53 compute-1 podman[224903]: 2025-11-24 13:54:53.531727024 +0000 UTC m=+0.063372146 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 13:54:54 compute-1 nova_compute[187078]: 2025-11-24 13:54:54.645 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:55 compute-1 nova_compute[187078]: 2025-11-24 13:54:55.660 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:54:55 compute-1 nova_compute[187078]: 2025-11-24 13:54:55.974 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:58 compute-1 podman[224945]: 2025-11-24 13:54:58.548634004 +0000 UTC m=+0.083841040 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 24 13:54:58 compute-1 podman[224946]: 2025-11-24 13:54:58.598941225 +0000 UTC m=+0.126572656 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 24 13:54:59 compute-1 nova_compute[187078]: 2025-11-24 13:54:59.647 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:54:59 compute-1 sshd-session[224900]: Received disconnect from 45.78.194.40 port 48698:11: Bye Bye [preauth]
Nov 24 13:54:59 compute-1 sshd-session[224900]: Disconnected from 45.78.194.40 port 48698 [preauth]
Nov 24 13:55:01 compute-1 nova_compute[187078]: 2025-11-24 13:55:01.025 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:55:04.185 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:55:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:55:04.186 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:55:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:55:04.186 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:55:04 compute-1 nova_compute[187078]: 2025-11-24 13:55:04.650 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:05 compute-1 podman[197429]: time="2025-11-24T13:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:55:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:55:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 24 13:55:06 compute-1 nova_compute[187078]: 2025-11-24 13:55:06.026 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:09 compute-1 nova_compute[187078]: 2025-11-24 13:55:09.654 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:11 compute-1 nova_compute[187078]: 2025-11-24 13:55:11.026 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:11 compute-1 podman[224990]: 2025-11-24 13:55:11.512091032 +0000 UTC m=+0.063676644 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64)
Nov 24 13:55:14 compute-1 nova_compute[187078]: 2025-11-24 13:55:14.657 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:16 compute-1 nova_compute[187078]: 2025-11-24 13:55:16.027 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: ERROR   13:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: ERROR   13:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: ERROR   13:55:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: ERROR   13:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: ERROR   13:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:55:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:55:19 compute-1 nova_compute[187078]: 2025-11-24 13:55:19.658 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:21 compute-1 nova_compute[187078]: 2025-11-24 13:55:21.029 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:24 compute-1 podman[225011]: 2025-11-24 13:55:24.502717926 +0000 UTC m=+0.053528929 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 13:55:24 compute-1 podman[225012]: 2025-11-24 13:55:24.534474775 +0000 UTC m=+0.081747503 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 13:55:24 compute-1 nova_compute[187078]: 2025-11-24 13:55:24.660 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:26 compute-1 nova_compute[187078]: 2025-11-24 13:55:26.029 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:29 compute-1 podman[225052]: 2025-11-24 13:55:29.511907706 +0000 UTC m=+0.059338637 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 13:55:29 compute-1 podman[225053]: 2025-11-24 13:55:29.555325631 +0000 UTC m=+0.098759084 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 13:55:29 compute-1 nova_compute[187078]: 2025-11-24 13:55:29.662 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:31 compute-1 nova_compute[187078]: 2025-11-24 13:55:31.031 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:34 compute-1 nova_compute[187078]: 2025-11-24 13:55:34.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:34 compute-1 nova_compute[187078]: 2025-11-24 13:55:34.667 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:35 compute-1 podman[197429]: time="2025-11-24T13:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:55:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:55:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 24 13:55:36 compute-1 nova_compute[187078]: 2025-11-24 13:55:36.035 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:36 compute-1 nova_compute[187078]: 2025-11-24 13:55:36.691 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:37 compute-1 nova_compute[187078]: 2025-11-24 13:55:37.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:37 compute-1 nova_compute[187078]: 2025-11-24 13:55:37.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:39 compute-1 nova_compute[187078]: 2025-11-24 13:55:39.671 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:40 compute-1 sshd-session[225096]: Invalid user zabbix from 45.78.217.131 port 44748
Nov 24 13:55:40 compute-1 sshd-session[225096]: Received disconnect from 45.78.217.131 port 44748:11: Bye Bye [preauth]
Nov 24 13:55:40 compute-1 sshd-session[225096]: Disconnected from invalid user zabbix 45.78.217.131 port 44748 [preauth]
Nov 24 13:55:40 compute-1 nova_compute[187078]: 2025-11-24 13:55:40.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:41 compute-1 nova_compute[187078]: 2025-11-24 13:55:41.038 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:42 compute-1 podman[225098]: 2025-11-24 13:55:42.494996747 +0000 UTC m=+0.045862972 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible)
Nov 24 13:55:42 compute-1 nova_compute[187078]: 2025-11-24 13:55:42.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.679 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.679 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.679 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.700 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.701 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.701 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.701 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.839 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.841 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5844MB free_disk=73.45531845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.841 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.841 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.903 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.903 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:55:43 compute-1 nova_compute[187078]: 2025-11-24 13:55:43.992 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:55:44 compute-1 nova_compute[187078]: 2025-11-24 13:55:44.013 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:55:44 compute-1 nova_compute[187078]: 2025-11-24 13:55:44.014 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:55:44 compute-1 nova_compute[187078]: 2025-11-24 13:55:44.015 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:55:44 compute-1 nova_compute[187078]: 2025-11-24 13:55:44.704 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:46 compute-1 nova_compute[187078]: 2025-11-24 13:55:46.075 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: ERROR   13:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: ERROR   13:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: ERROR   13:55:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: ERROR   13:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: ERROR   13:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:55:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:55:49 compute-1 nova_compute[187078]: 2025-11-24 13:55:49.761 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:51 compute-1 nova_compute[187078]: 2025-11-24 13:55:51.126 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:53 compute-1 nova_compute[187078]: 2025-11-24 13:55:53.007 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:55:54 compute-1 nova_compute[187078]: 2025-11-24 13:55:54.765 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:55 compute-1 podman[225119]: 2025-11-24 13:55:55.53581193 +0000 UTC m=+0.078086174 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 24 13:55:55 compute-1 podman[225118]: 2025-11-24 13:55:55.541951416 +0000 UTC m=+0.084990751 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:55:56 compute-1 nova_compute[187078]: 2025-11-24 13:55:56.176 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:55:59 compute-1 nova_compute[187078]: 2025-11-24 13:55:59.768 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:00 compute-1 podman[225161]: 2025-11-24 13:56:00.517933786 +0000 UTC m=+0.057063354 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 13:56:00 compute-1 podman[225162]: 2025-11-24 13:56:00.557828866 +0000 UTC m=+0.093710997 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:56:01 compute-1 nova_compute[187078]: 2025-11-24 13:56:01.178 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:03 compute-1 nova_compute[187078]: 2025-11-24 13:56:03.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:03 compute-1 nova_compute[187078]: 2025-11-24 13:56:03.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 13:56:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:56:04.187 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:56:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:56:04.187 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:56:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:56:04.187 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:56:04 compute-1 nova_compute[187078]: 2025-11-24 13:56:04.772 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:05 compute-1 podman[197429]: time="2025-11-24T13:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:56:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:56:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Nov 24 13:56:06 compute-1 nova_compute[187078]: 2025-11-24 13:56:06.184 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:06 compute-1 nova_compute[187078]: 2025-11-24 13:56:06.685 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:06 compute-1 nova_compute[187078]: 2025-11-24 13:56:06.686 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 13:56:06 compute-1 nova_compute[187078]: 2025-11-24 13:56:06.707 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 13:56:09 compute-1 nova_compute[187078]: 2025-11-24 13:56:09.811 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:11 compute-1 nova_compute[187078]: 2025-11-24 13:56:11.185 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:13 compute-1 podman[225209]: 2025-11-24 13:56:13.499593447 +0000 UTC m=+0.052104350 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 24 13:56:14 compute-1 sshd-session[225232]: banner exchange: Connection from 20.14.93.239 port 43518: invalid format
Nov 24 13:56:14 compute-1 nova_compute[187078]: 2025-11-24 13:56:14.852 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:16 compute-1 nova_compute[187078]: 2025-11-24 13:56:16.217 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:18 compute-1 sshd-session[225233]: Invalid user solv from 45.148.10.240 port 33758
Nov 24 13:56:18 compute-1 sshd-session[225233]: Connection closed by invalid user solv 45.148.10.240 port 33758 [preauth]
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: ERROR   13:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: ERROR   13:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: ERROR   13:56:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: ERROR   13:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: ERROR   13:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:56:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:56:19 compute-1 nova_compute[187078]: 2025-11-24 13:56:19.853 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:21 compute-1 nova_compute[187078]: 2025-11-24 13:56:21.219 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:23 compute-1 sshd-session[225230]: Connection closed by 20.14.93.239 port 43504 [preauth]
Nov 24 13:56:24 compute-1 nova_compute[187078]: 2025-11-24 13:56:24.855 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:26 compute-1 nova_compute[187078]: 2025-11-24 13:56:26.221 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:26 compute-1 podman[225236]: 2025-11-24 13:56:26.517239216 +0000 UTC m=+0.052675017 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 13:56:26 compute-1 podman[225235]: 2025-11-24 13:56:26.527672638 +0000 UTC m=+0.070779366 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:56:29 compute-1 nova_compute[187078]: 2025-11-24 13:56:29.920 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:31 compute-1 nova_compute[187078]: 2025-11-24 13:56:31.224 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:31 compute-1 podman[225279]: 2025-11-24 13:56:31.509520156 +0000 UTC m=+0.061748502 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 13:56:31 compute-1 podman[225280]: 2025-11-24 13:56:31.551672676 +0000 UTC m=+0.093490290 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:56:34 compute-1 nova_compute[187078]: 2025-11-24 13:56:34.923 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:35 compute-1 podman[197429]: time="2025-11-24T13:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:56:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:56:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Nov 24 13:56:36 compute-1 nova_compute[187078]: 2025-11-24 13:56:36.225 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:36 compute-1 nova_compute[187078]: 2025-11-24 13:56:36.690 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:37 compute-1 nova_compute[187078]: 2025-11-24 13:56:37.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:39 compute-1 nova_compute[187078]: 2025-11-24 13:56:39.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:39 compute-1 nova_compute[187078]: 2025-11-24 13:56:39.925 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:41 compute-1 nova_compute[187078]: 2025-11-24 13:56:41.227 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:41 compute-1 nova_compute[187078]: 2025-11-24 13:56:41.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:44 compute-1 podman[225326]: 2025-11-24 13:56:44.509952565 +0000 UTC m=+0.059444059 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 24 13:56:44 compute-1 nova_compute[187078]: 2025-11-24 13:56:44.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:44 compute-1 nova_compute[187078]: 2025-11-24 13:56:44.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:56:44 compute-1 nova_compute[187078]: 2025-11-24 13:56:44.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:56:44 compute-1 nova_compute[187078]: 2025-11-24 13:56:44.685 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:56:44 compute-1 nova_compute[187078]: 2025-11-24 13:56:44.685 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:44 compute-1 nova_compute[187078]: 2025-11-24 13:56:44.929 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.684 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.685 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.685 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.812 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.813 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5847MB free_disk=73.45525741577148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.813 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.813 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.907 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.907 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.929 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.941 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.942 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:56:45 compute-1 nova_compute[187078]: 2025-11-24 13:56:45.942 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:56:46 compute-1 nova_compute[187078]: 2025-11-24 13:56:46.227 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: ERROR   13:56:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: ERROR   13:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: ERROR   13:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: ERROR   13:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: ERROR   13:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:56:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:56:49 compute-1 nova_compute[187078]: 2025-11-24 13:56:49.931 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:51 compute-1 nova_compute[187078]: 2025-11-24 13:56:51.229 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:53 compute-1 nova_compute[187078]: 2025-11-24 13:56:53.936 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:56:54 compute-1 nova_compute[187078]: 2025-11-24 13:56:54.934 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:56 compute-1 nova_compute[187078]: 2025-11-24 13:56:56.230 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:56:57 compute-1 podman[225348]: 2025-11-24 13:56:57.501452913 +0000 UTC m=+0.049131571 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 13:56:57 compute-1 podman[225347]: 2025-11-24 13:56:57.500663121 +0000 UTC m=+0.052574554 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 13:56:59 compute-1 nova_compute[187078]: 2025-11-24 13:56:59.938 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:00 compute-1 nova_compute[187078]: 2025-11-24 13:57:00.659 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:01 compute-1 nova_compute[187078]: 2025-11-24 13:57:01.233 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:01 compute-1 sshd-session[225390]: Invalid user solana from 193.32.162.146 port 36824
Nov 24 13:57:02 compute-1 podman[225392]: 2025-11-24 13:57:02.071615992 +0000 UTC m=+0.066405328 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:57:02 compute-1 podman[225393]: 2025-11-24 13:57:02.104572044 +0000 UTC m=+0.095763203 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:57:02 compute-1 sshd-session[225390]: Connection closed by invalid user solana 193.32.162.146 port 36824 [preauth]
Nov 24 13:57:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:57:04.188 104225 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:57:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:57:04.191 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:57:04 compute-1 ovn_metadata_agent[104220]: 2025-11-24 13:57:04.192 104225 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:57:04 compute-1 nova_compute[187078]: 2025-11-24 13:57:04.942 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:05 compute-1 podman[197429]: time="2025-11-24T13:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:57:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:57:05 compute-1 podman[197429]: @ - - [24/Nov/2025:13:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 24 13:57:06 compute-1 nova_compute[187078]: 2025-11-24 13:57:06.233 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:09 compute-1 nova_compute[187078]: 2025-11-24 13:57:09.945 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:11 compute-1 nova_compute[187078]: 2025-11-24 13:57:11.235 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:14 compute-1 nova_compute[187078]: 2025-11-24 13:57:14.948 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:15 compute-1 podman[225438]: 2025-11-24 13:57:15.51098753 +0000 UTC m=+0.055524294 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Nov 24 13:57:16 compute-1 nova_compute[187078]: 2025-11-24 13:57:16.237 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: ERROR   13:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: ERROR   13:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: ERROR   13:57:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: ERROR   13:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: ERROR   13:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:57:19 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:57:19 compute-1 nova_compute[187078]: 2025-11-24 13:57:19.950 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:21 compute-1 nova_compute[187078]: 2025-11-24 13:57:21.240 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:24 compute-1 nova_compute[187078]: 2025-11-24 13:57:24.955 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:26 compute-1 nova_compute[187078]: 2025-11-24 13:57:26.241 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:28 compute-1 podman[225460]: 2025-11-24 13:57:28.497292519 +0000 UTC m=+0.044483705 container health_status 72e8d52eb0a715a20cf6e99c393e23ea46b1b9ef39804fe7c049953271d25c2b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 13:57:28 compute-1 podman[225461]: 2025-11-24 13:57:28.530659922 +0000 UTC m=+0.071769243 container health_status a133cfb49c3362bd2f3f341b466b2ebe6f2552fac6012a23df0ee135913504d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 13:57:29 compute-1 nova_compute[187078]: 2025-11-24 13:57:29.959 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:31 compute-1 nova_compute[187078]: 2025-11-24 13:57:31.243 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:32 compute-1 sshd-session[225459]: Connection closed by 45.78.194.40 port 59332 [preauth]
Nov 24 13:57:32 compute-1 podman[225506]: 2025-11-24 13:57:32.522704085 +0000 UTC m=+0.064846785 container health_status 32da7bcb793f4ea47301c171a242d57dbc3bf2c50d89a53e5bce7924efcd4151 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 13:57:32 compute-1 podman[225507]: 2025-11-24 13:57:32.545529843 +0000 UTC m=+0.085328281 container health_status e3ea01f531f3df5df1ba18b2deadacb254979aa3b06ce7e77fc0627a00c1e321 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 13:57:34 compute-1 nova_compute[187078]: 2025-11-24 13:57:34.963 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:35 compute-1 podman[197429]: time="2025-11-24T13:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 13:57:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 24 13:57:35 compute-1 podman[197429]: @ - - [24/Nov/2025:13:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 24 13:57:36 compute-1 nova_compute[187078]: 2025-11-24 13:57:36.245 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:38 compute-1 nova_compute[187078]: 2025-11-24 13:57:38.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:38 compute-1 nova_compute[187078]: 2025-11-24 13:57:38.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:38 compute-1 sshd-session[225551]: Accepted publickey for zuul from 192.168.122.10 port 36898 ssh2: ECDSA SHA256:kJaEBOMeTJvARgbgWpeFndaFN7gWW4nOgnARECFnz7w
Nov 24 13:57:39 compute-1 systemd-logind[815]: New session 52 of user zuul.
Nov 24 13:57:39 compute-1 systemd[1]: Started Session 52 of User zuul.
Nov 24 13:57:39 compute-1 sshd-session[225551]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:57:39 compute-1 sudo[225555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 24 13:57:39 compute-1 sudo[225555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:57:39 compute-1 nova_compute[187078]: 2025-11-24 13:57:39.967 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:41 compute-1 nova_compute[187078]: 2025-11-24 13:57:41.246 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:41 compute-1 nova_compute[187078]: 2025-11-24 13:57:41.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:41 compute-1 nova_compute[187078]: 2025-11-24 13:57:41.668 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:43 compute-1 ovs-vsctl[225726]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 24 13:57:44 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 24 13:57:44 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 24 13:57:44 compute-1 virtqemud[186628]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 13:57:44 compute-1 nova_compute[187078]: 2025-11-24 13:57:44.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:44 compute-1 nova_compute[187078]: 2025-11-24 13:57:44.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 13:57:44 compute-1 nova_compute[187078]: 2025-11-24 13:57:44.667 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 13:57:44 compute-1 nova_compute[187078]: 2025-11-24 13:57:44.680 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 13:57:44 compute-1 nova_compute[187078]: 2025-11-24 13:57:44.968 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:45 compute-1 crontab[226146]: (root) LIST (root)
Nov 24 13:57:45 compute-1 nova_compute[187078]: 2025-11-24 13:57:45.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.248 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:46 compute-1 podman[226221]: 2025-11-24 13:57:46.519228901 +0000 UTC m=+0.062606285 container health_status 952093d9acfb0eef07882df9665962a1f26357e91ccb47c13dba1ac04ed9859b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal)
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.666 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.666 187082 DEBUG nova.compute.manager [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.667 187082 DEBUG oslo_service.periodic_task [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.693 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.693 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.826 187082 WARNING nova.virt.libvirt.driver [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.827 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5480MB free_disk=73.34721374511719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.827 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.827 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.897 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.898 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.926 187082 DEBUG nova.compute.provider_tree [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed in ProviderTree for provider: ece8f004-1d5b-407f-a713-f9e87706b045 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.936 187082 DEBUG nova.scheduler.client.report [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Inventory has not changed for provider ece8f004-1d5b-407f-a713-f9e87706b045 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.938 187082 DEBUG nova.compute.resource_tracker [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 13:57:46 compute-1 nova_compute[187078]: 2025-11-24 13:57:46.938 187082 DEBUG oslo_concurrency.lockutils [None req-6640a862-9e10-4ba1-ab08-ef5b9e3256e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 13:57:47 compute-1 systemd[1]: Starting Hostname Service...
Nov 24 13:57:47 compute-1 systemd[1]: Started Hostname Service.
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: ERROR   13:57:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: ERROR   13:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: ERROR   13:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: ERROR   13:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: ERROR   13:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 24 13:57:49 compute-1 openstack_network_exporter[199599]: 
Nov 24 13:57:49 compute-1 nova_compute[187078]: 2025-11-24 13:57:49.970 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 13:57:51 compute-1 nova_compute[187078]: 2025-11-24 13:57:51.249 187082 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
